Compare commits


14 Commits

Author SHA1 Message Date
a18c16b0b9 fix(shell): harden shared.sh and dfm for set -euo pipefail
Use ${VAR:-} defaults in shared.sh to prevent set -u failures on
unset variables (DOTFILES, ZSH_CUSTOM_COMPLETION_PATH, FPATH).
Export DOTFILES/BREWFILE/HOSTFILES in dfm so sourced scripts see them.
2026-02-08 01:12:39 +02:00
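The `${VAR:-}` hardening described in that commit can be sketched in isolation (hypothetical variable names, not the actual shared.sh code):

```shell
#!/usr/bin/env bash
# Sketch: why ${VAR:-} defaults matter under `set -u`.
set -euo pipefail

unset MAYBE_UNSET 2> /dev/null || true

# Without a default, expanding an unset variable aborts the script:
#   echo "$MAYBE_UNSET"   # -> "MAYBE_UNSET: unbound variable", exit 1

# With the :- default, the expansion yields an empty string instead:
echo "value='${MAYBE_UNSET:-}'" # -> value=''

# A fallback value can also be supplied inline:
echo "completions='${ZSH_COMPLETIONS:-$HOME/.config/zsh/completion}'"
```

The same expansion works in tests (`[ -z "${FPATH:-}" ]`), which is why it is safe to source such a file from a `set -u` shell.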
785a8e8eb7 fix(exports): prevent set -e abort when optional files are missing
Replace `[ -f ] && source` with `if/then/fi` for conditional source
lines so the file returns 0 even when optional exports files don't
exist. Also use `${VAR:-}` for XDG defaults to avoid set -u failures.
2026-02-08 01:11:55 +02:00
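The difference between the two conditional-source styles can be sketched as follows (hypothetical file path, not the actual exports file):

```shell
#!/usr/bin/env bash
# Sketch: `[ -f ] && source` vs if/then/fi under `set -e`.
set -euo pipefail

optional="/tmp/does-not-exist-$$.sh"

# Risky pattern: when the file is missing, the && chain evaluates to 1.
# As the final statement of a sourced file, that nonzero status becomes
# the file's return value and can abort a `set -e` caller:
#   [ -f "$optional" ] && source "$optional"

# Safe pattern: an if statement whose condition is false exits with 0.
if [ -f "$optional" ]; then
  # shellcheck disable=SC1090
  source "$optional"
fi
echo "still running"
```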
1cda859999 docs(claude): expand CLAUDE.md with msgr, dfm commands, gotchas, and hooks 2026-02-08 00:31:33 +02:00
bc69560da4 feat(claude): add shfmt/vendor hooks and shell-validate skill 2026-02-08 00:26:19 +02:00
2ee9407a43 feat(dfm): add 6 install commands and reorder install all into tiers 2026-02-07 23:41:51 +02:00
765c2fce72 test(dfm): expand bats tests from 1 to 16
Add tests for menu output of all sections (install, helpers, docs,
dotfiles, check, scripts, tests), routing of invalid input, install
menu completeness for all 19 entries, and check arch/host commands.
2026-02-07 23:20:02 +02:00
88eceaf194 fix(dfm): restrict cheat-databases glob to .sh files only
The install-cheat-* glob was matching .md documentation files, causing
errors when bash tried to execute them.
2026-02-07 22:45:08 +02:00
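The glob fix can be illustrated with a standalone sketch (temporary files stand in for the real cheat-database scripts):

```shell
#!/usr/bin/env bash
# Sketch: restricting a glob so it matches scripts but not docs.
set -euo pipefail
shopt -s nullglob

dir=$(mktemp -d)
touch "$dir/install-cheat-foo.sh" "$dir/install-cheat-foo.md"

# Broad glob picks up the .md documentation too:
broad=("$dir"/install-cheat-*)
# Restricted glob only matches the shell scripts:
scripts=("$dir"/install-cheat-*.sh)

echo "broad=${#broad[@]} scripts=${#scripts[@]}" # -> broad=2 scripts=1
rm -rf "$dir"
```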
6d72003446 fix(lint): fix all sonarcloud detected issues (#279)
* fix(ci): replace broad permissions with specific scopes in workflows

Replace read-all/write-all with minimum required permission scopes
across all GitHub Actions workflows to follow the principle of least
privilege (SonarCloud rule githubactions:S8234).

* fix(shell): use [[ instead of [ for conditional tests

Replace single brackets with double brackets in bash conditional
expressions across 14 files (28 changes). All scripts use bash
shebangs so [[ is safe everywhere (SonarCloud rule shelldre:S7688).
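A minimal illustration of the difference (standalone sketch, not code from the repository):

```shell
#!/usr/bin/env bash
# Sketch: why [[ ]] is safer than [ ] in bash.
set -euo pipefail

var="two words"

# [[ ]] does not word-split unquoted expansions and supports
# pattern matching, which [ ] cannot do:
if [[ $var == two* ]]; then
  echo "pattern match works"
fi

# With [ ], the same unquoted expansion is split into two arguments:
#   [ $var = "two words" ]   # -> "[: too many arguments"
if [ "$var" = "two words" ]; then
  echo "single brackets need careful quoting"
fi
```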

* fix(shell): add explicit return statements to functions

Add return 0 as the last statement in ~46 shell functions across
17 files that previously relied on implicit return codes
(SonarCloud rule shelldre:S7682).

* fix(shell): assign positional parameters to local variables

Replace direct $1/$2/$3 usage with named local variables in _log(),
msg(), msg_err(), msg_done(), msg_run(), msg_ok(), and array_diff()
(SonarCloud rule shelldre:S7679).
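The refactor pattern can be sketched with a hypothetical function (not the actual `msg()` implementation):

```shell
#!/usr/bin/env bash
# Sketch: naming positional parameters instead of using $1/$2 directly.
set -euo pipefail

# Before: direct positional access, hard to follow in longer bodies.
msg_before() {
  printf '%s: %s\n' "$1" "$2"
}

# After: named locals document what each argument means.
msg_after() {
  local level="$1"
  local text="$2"
  printf '%s: %s\n' "$level" "$text"
  return 0
}

msg_after "ok" "locals make intent explicit"
```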

* fix(python): replace dict() constructor with literal

Use {} instead of dict() for empty dictionary initialization
(SonarCloud rule python:S7498).

* fix(shell): fix husky shebang and tolerate npm outdated exit code

* docs(shell): add function docstring comments

* fix(shell): fix heredoc indentation in x-sonarcloud

* feat(python): add ruff linter and formatter configuration

* fix(ci): align megalinter config with biome, ruff, and shfmt settings

* fix(ci): disable black and yaml-prettier in megalinter config

* chore(ci): update ruff-pre-commit to v0.15.0 and fix hook name

* fix(scripts): check for .git dir before skipping clone in install-fonts

* fix(shell): address code review issues in scripts and shared.sh

- Guard wezterm show-keys failure in create-wezterm-keymaps.sh
- Stop masking git failures with return 0 in install-cheat-purebashbible.sh
- Add missing shared.sh source in install-xcode-cli-tools.sh
- Replace exit 1 with return 1 in sourced shared.sh
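The `exit` vs `return` distinction in sourced files can be sketched like this (hypothetical library, not the actual shared.sh):

```shell
#!/usr/bin/env bash
# Sketch: `exit 1` in a sourced file kills the calling shell;
# `return 1` only reports failure to the caller.
set -euo pipefail

lib=$(mktemp)
cat > "$lib" << 'EOF'
check_something() {
  # In a sourced library, prefer `return` so the shell that
  # sourced us is not terminated on failure.
  return 1
}
EOF

# shellcheck disable=SC1090
source "$lib"
if ! check_something; then
  echo "caller survives and can handle the failure"
fi
rm -f "$lib"
```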

* fix(scripts): address code review and security findings

- Guard wezterm show-keys failure in create-wezterm-keymaps.sh
- Stop masking git failures with return 0 in install-cheat-purebashbible.sh
- Add missing shared.sh source in install-xcode-cli-tools.sh
- Replace exit 1 with return 1 in sourced shared.sh
- Remove shell=True subprocess calls in x-git-largest-files.py

* style(shell): apply shfmt formatting and add args to pre-commit hook

* fix(python): suppress bandit false positives in x-git-largest-files

* fix(python): add nosemgrep suppression for check_output call

* feat(format): add prettier for YAML formatting

Install prettier, add .prettierrc.json config (200-char width, 2-space
indent, LF endings), .prettierignore, yarn scripts (lint:prettier,
fix:prettier, format:yaml), and pre-commit hook scoped to YAML files.

* style(yaml): apply prettier formatting

* fix(scripts): address remaining code review findings

- Python: use list comprehension to filter empty strings instead of
  slicing off the last element
- create-wezterm-keymaps: write to temp file and mv for atomic updates
- install-xcode-cli-tools: fix shellcheck source directive path
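The temp-file-plus-`mv` pattern mentioned for create-wezterm-keymaps can be sketched generically (paths are illustrative):

```shell
#!/usr/bin/env bash
# Sketch: atomic file update via temp file + mv.
set -euo pipefail

target=$(mktemp) # stand-in for the real generated file
tmp=$(mktemp "${target}.XXXXXX")

# Generate into the temp file; if generation dies halfway,
# the old target is untouched.
printf 'generated content\n' > "$tmp"

# mv within the same filesystem is atomic: readers see either the
# old file or the new one, never a half-written file.
mv "$tmp" "$target"

cat "$target"
rm -f "$target"
```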

* fix(python): sort imports alphabetically in x-git-largest-files

* fix(lint): disable PYTHON_ISORT in MegaLinter, ruff handles it

* chore(git): add __pycache__ to gitignore

* fix(python): rename ambiguous variable l to line (E741)

* style: remove trailing whitespace and blank lines

* style(fzf): apply shfmt formatting

* style(shell): apply shfmt formatting

* docs(plans): add design documents

* style(docs): add language specifier to fenced code block

* feat(lint): add markdown-table-formatter to dev tooling

Add markdown-table-formatter as a dev dependency with yarn scripts
(lint:md-table, fix:md-table) and a local pre-commit hook to
automatically format markdown tables on commit.
2026-02-07 19:01:02 +02:00
cff3d1dd8a feat(scripts): add x-sonarcloud script for LLM-driven issue analysis
Bridges LLM agents with SonarCloud's REST API to fetch and format
code quality issues as structured markdown with processing instructions.
2026-02-07 13:24:29 +02:00
a47ce85991 chore: remove hammerspoon type generator and types 2026-02-06 09:12:06 +02:00
13dd701eb7 feat(a): improve encryption script with better error handling
- Add dependency check for age and curl with install instructions
- Add --delete flag to remove originals after encryption
- Add -f/--force flag to control overwrite behavior
- Skip already-encrypted .age files during encryption
- Include hidden files (dotglob) when encrypting directories
- Handle empty directories gracefully with nullglob
- Allow flags in any position (proper option parsing)
- Add set -euo pipefail for better error handling
- Update documentation with all features and examples
- Bump version to 1.1.0
2026-02-06 01:51:01 +02:00
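The `dotglob`/`nullglob` behaviors from that commit can be demonstrated standalone (temporary directories stand in for the encryption targets):

```shell
#!/usr/bin/env bash
# Sketch: dotglob includes hidden files; nullglob makes an empty
# directory expand to zero entries instead of the literal pattern.
set -euo pipefail
shopt -s dotglob nullglob

dir=$(mktemp -d)
touch "$dir/file.txt" "$dir/.hidden"

count=0
for f in "$dir"/*; do
  count=$((count + 1))
done
echo "entries seen: $count" # hidden file included -> 2

# With nullglob, an empty directory is handled gracefully:
# the loop body simply never runs.
empty=$(mktemp -d)
for f in "$empty"/*; do
  echo "never reached"
done
rm -rf "$dir" "$empty"
```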
cfde007494 fix(shell): clean up rcfiles and remove redundancies
- Remove deprecated GREP_OPTIONS (handled via alias)
- Quote $ZSH_COMPDUMP to prevent word splitting
- Remove duplicate vim alias (nvim alias takes precedence)
- Consolidate completion path to ZSH_CUSTOM_COMPLETION_PATH
- Simplify PATH setup in rcfiles, centralize in exports
- Move LM Studio PATH from rcfiles to exports
- Add clarifying comments for macOS-specific ssh-add
2026-02-06 00:09:03 +02:00
ed4aa2ffe1 feat(scripts): add install-dnf-packages.sh for Fedora/RHEL 2026-02-05 23:46:10 +02:00
bcf11406b6 feat(scripts): add install-apt-packages.sh for Debian/Ubuntu
Install essential developer packages via apt:
- Build tools: build-essential, cmake, pkg-config, autoconf, automake, libtool
- Dev libraries: libssl-dev, libffi-dev, zlib1g-dev, libreadline-dev, etc.
- CLI utilities: jq, tmux, tree, unzip, shellcheck, socat, gnupg

Curated to avoid duplicates with cargo/go installs (ripgrep, fd, fzf, etc.).
Uses batched apt install for efficiency and exits gracefully on non-Debian systems.
2026-02-05 23:42:16 +02:00
94 changed files with 2621 additions and 19350 deletions

.claude/settings.json Normal file

@@ -0,0 +1,26 @@
{
  "hooks": {
    "PreToolUse": [
      {
        "matcher": "Edit|Write",
        "hooks": [
          {
            "type": "command",
            "command": "fp=$(cat | jq -r '.tool_input.file_path // empty') && [ -n \"$fp\" ] && case \"$fp\" in */fzf-tmux|*/yarn.lock|*/.yarn/*) echo \"BLOCKED: $fp is a vendor/lock file — do not edit directly\" >&2; exit 2;; esac; exit 0"
          }
        ]
      }
    ],
    "PostToolUse": [
      {
        "matcher": "Edit|Write",
        "hooks": [
          {
            "type": "command",
            "command": "fp=$(cat | jq -r '.tool_input.file_path // empty') && [ -n \"$fp\" ] && [ -f \"$fp\" ] && case \"$fp\" in *.sh|*/bin/*) head -1 \"$fp\" | grep -qE '^#!.*(ba)?sh' && command -v shfmt > /dev/null && shfmt -i 2 -bn -ci -sr -fn -w \"$fp\";; esac; exit 0"
          }
        ]
      }
    ]
  }
}


@@ -0,0 +1,37 @@
---
name: shell-validate
description: Validate shell scripts after editing. Apply when writing or modifying any shell script in local/bin/ or scripts/.
user-invocable: false
allowed-tools: Bash, Read, Grep
---
After editing any shell script in `local/bin/`, `scripts/`, or `config/` (files with a `#!` shebang or `# shellcheck shell=` directive), validate it:
## 1. Determine the shell
- `/bin/sh` or `#!/usr/bin/env sh` shebang -> POSIX, use `sh -n`
- `/bin/bash` or `#!/usr/bin/env bash` shebang -> Bash, use `bash -n`
- `# shellcheck shell=bash` directive (no shebang) -> use `bash -n`
- `# shellcheck shell=sh` directive (no shebang) -> use `sh -n`
- No shebang and no directive -> default to `bash -n`
## 2. Syntax check
Run the appropriate syntax checker:
```bash
bash -n <file> # for bash scripts
sh -n <file> # for POSIX sh scripts
```
If syntax check fails, fix the issue before proceeding.
## 3. ShellCheck
Run `shellcheck <file>`. The project `.shellcheckrc` already disables SC2039, SC2166, SC2154, SC1091, SC2174, SC2016. Only report and fix warnings that are NOT in that exclude list.
## Key files to never validate (not shell scripts)
- `local/bin/fzf-tmux` (vendor file)
- `*.md` files
- `*.bats` test files (Bats, not plain shell)


@@ -8,6 +8,10 @@ indent_style = space
 insert_final_newline = true
 trim_trailing_whitespace = true
+
+[*.py]
+indent_size = 4
+max_line_length = 120
 [*.fish]
 max_line_length = 120

.github/README.md vendored

@@ -37,7 +37,7 @@ see what interesting stuff you've done with it. Sharing is caring.
 ### Interesting folders
 | Path | Description |
-| ------------------- | -------------------------------------------- |
+|---------------------|----------------------------------------------|
 | `.github` | GitHub Repository configuration files, meta. |
 | `hosts/{hostname}/` | Configs that should apply to that host only. |
 | `local/bin` | Helper scripts that I've collected or wrote. |
@@ -52,7 +52,7 @@ is processed by Dotbot during installation.
 ### dotfile folders
 | Repo | Destination | Description |
-| --------- | ----------- | ------------------------------------------- |
+|-----------|-------------|---------------------------------------------|
 | `base/` | `.*` | `$HOME` level files. |
 | `config/` | `.config/` | Configurations for applications. |
 | `local/` | `.local/` | XDG Base folder: `bin`, `share` and `state` |
@@ -86,7 +86,7 @@ The folder structure follows [XDG Base Directory Specification][xdg] where possible.
 ### XDG Variables
 | Env | Default | Short description |
-| ------------------ | -------------------- | ---------------------------------------------- |
+|--------------------|----------------------|------------------------------------------------|
 | `$XDG_BIN_HOME` | `$HOME/.local/bin` | Local binaries |
 | `$XDG_CONFIG_HOME` | `$HOME/.config` | User-specific configs |
 | `$XDG_DATA_HOME` | `$HOME/.local/share` | User-specific data files |


@@ -9,13 +9,15 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   debug-changelog:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: read
     steps:
       - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -27,7 +29,7 @@ jobs:
           token: ${{ secrets.GITHUB_TOKEN }}
           config_file: .github/tag-changelog-config.js
-      - name: 'Echo results'
+      - name: "Echo results"
         id: output-changelog
         run: |
           echo "${{ steps.changelog.outputs.changes }}"


@@ -11,7 +11,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   Linter:


@@ -5,19 +5,21 @@ name: Release Daily State
 on:
   workflow_dispatch:
   schedule:
-    - cron: '0 21 * * *' # 00:00 at Europe/Helsinki
+    - cron: "0 21 * * *" # 00:00 at Europe/Helsinki
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   new-daily-release:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: write
     outputs:
       created: ${{ steps.daily-version.outputs.created }}


@@ -5,14 +5,15 @@ name: Pre-commit autoupdate
 on:
   schedule:
     # At 04:00 on Monday and Thursday.
-    - cron: '0 4 * * 1,4'
+    - cron: "0 4 * * 1,4"
   workflow_dispatch:
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   auto-update:
@@ -33,6 +34,6 @@ jobs:
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
           branch: update/pre-commit-hooks
-          title: 'chore: update pre-commit hooks'
-          commit-message: 'chore: update pre-commit hooks'
+          title: "chore: update pre-commit hooks"
+          commit-message: "chore: update pre-commit hooks"
           body: Update versions of pre-commit hooks to latest version.


@@ -14,7 +14,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  pull-requests: read
 jobs:
   semantic-pr:


@@ -11,7 +11,7 @@ on:
       - .github/workflows/sync-labels.yml
       - .github/labels.yml
   schedule:
-    - cron: '34 5 * * *'
+    - cron: "34 5 * * *"
   workflow_call:
   workflow_dispatch:
@@ -19,7 +19,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   SyncLabels:


@@ -5,20 +5,22 @@ name: Update submodules
 on:
   schedule:
     # At 04:00 on Monday and Thursday.
-    - cron: '0 4 * * 1'
+    - cron: "0 4 * * 1"
   workflow_dispatch:
 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true
-permissions: read-all
+permissions:
+  contents: read
 jobs:
   update-submodules:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: write
     steps:
       - name: Checkout repository

.gitignore vendored

@@ -56,5 +56,6 @@ local/man/yabai.1
 local/share/fonts/*
 lock
 node_modules
+__pycache__
 ssh/local.d/*
 config/fish/fish_variables*


@@ -9,16 +9,21 @@ VALIDATE_ALL_CODEBASE: true
 FILEIO_REPORTER: false # Generate file.io report
 GITHUB_STATUS_REPORTER: true # Generate GitHub status report
 IGNORE_GENERATED_FILES: true # Ignore generated files
+JAVASCRIPT_DEFAULT_STYLE: prettier # Default style for JavaScript
 PRINT_ALPACA: false # Print Alpaca logo in console
 SARIF_REPORTER: true # Generate SARIF report
 SHOW_SKIPPED_LINTERS: false # Show skipped linters in MegaLinter log
+TYPESCRIPT_DEFAULT_STYLE: prettier # Default style for TypeScript
 DISABLE_LINTERS:
   - REPOSITORY_DEVSKIM
   - JAVASCRIPT_ES # using biome
   - JAVASCRIPT_PRETTIER # using biome
+  - TYPESCRIPT_PRETTIER # using biome
+  - JSON_PRETTIER # using biome
+  - PYTHON_BLACK # using ruff
+  - PYTHON_FLAKE8 # using ruff
+  - PYTHON_PYLINT # using ruff
+  - PYTHON_ISORT # using ruff (I rules)
 YAML_YAMLLINT_CONFIG_FILE: .yamllint.yml
 REPOSITORY_GIT_DIFF_DISABLE_ERRORS: true
+BASH_SHFMT_ARGUMENTS: -i 2 -bn -ci -sr -fn
 FILTER_REGEX_EXCLUDE: >
   (node_modules|tools|config/cheat/cheatsheets/community|config/cheat/cheatsheets/tldr|config/fzf|config/zsh|config/tmux/plugins)


@@ -28,12 +28,25 @@ repos:
         entry: yarn biome check --write --files-ignore-unknown=true --no-errors-on-unmatched
         language: system
         files: \.(js|ts|jsx|tsx|json|md)$
+      - id: markdown-table-formatter
+        name: Markdown Table Formatter
+        entry: yarn markdown-table-formatter
+        language: system
+        types: [markdown]
   - repo: https://github.com/adrienverge/yamllint
     rev: v1.38.0
     hooks:
       - id: yamllint
+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: v4.0.0-alpha.8
+    hooks:
+      - id: prettier
+        types_or: [yaml]
+        additional_dependencies:
+          - prettier@3.8.1
   - repo: https://github.com/shellcheck-py/shellcheck-py
     rev: v0.11.0.1
     hooks:
@@ -43,6 +56,7 @@ repos:
     rev: v3.12.0-2
     hooks:
       - id: shfmt
+        args: [-i, "2", -bn, -ci, -sr, -fn, -w]
   - repo: https://github.com/rhysd/actionlint
     rev: v1.7.10
@@ -60,3 +74,10 @@ repos:
     hooks:
       - id: fish_syntax
       - id: fish_indent
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: v0.15.0
+    hooks:
+      - id: ruff-check
+        args: [--fix]
+      - id: ruff-format

.prettierignore Normal file

@@ -0,0 +1,18 @@
node_modules
.yarn
.pnp.*
.mypy_cache
Brewfile.lock.json
lazy-lock.json
config/cheat/cheatsheets/community
config/cheat/cheatsheets/tldr
config/fzf
config/nvim
config/op/plugins/used_plugins
config/tmux/plugins
config/vim/plugged
config/zsh
local/bin/antigen.zsh
local/bin/asdf
tools
docs/plans

.prettierrc.json Normal file

@@ -0,0 +1,9 @@
{
"$schema": "https://json.schemastore.org/prettierrc",
"printWidth": 200,
"tabWidth": 2,
"useTabs": false,
"endOfLine": "lf",
"singleQuote": false,
"proseWrap": "preserve"
}


@@ -13,11 +13,11 @@ ignore_all_files_in_gitignore: true
 # Was previously called `ignored_dirs`, please update your config if you are using that.
 # Added (renamed) on 2025-04-07
 ignored_paths:
-  - '*.swp'
-  - '*.tmp'
-  - '*.tmp.*'
-  - '.DS_Store'
-  - '.git/**'
+  - "*.swp"
+  - "*.tmp"
+  - "*.tmp.*"
+  - ".DS_Store"
+  - ".git/**"
   - /config/cheat/cheatsheets/community/**
   - /config/cheat/cheatsheets/pure-bash-bible/**
   - /config/cheat/cheatsheets/tldr/**
@@ -85,6 +85,6 @@ excluded_tools: []
 # initial prompt for the project. It will always be given to the LLM upon activating the project
 # (contrary to the memories, which are loaded on demand).
-initial_prompt: ''
-project_name: '.dotfiles'
+initial_prompt: ""
+project_name: ".dotfiles"

CLAUDE.md Normal file

@@ -0,0 +1,157 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code)
when working with code in this repository.
## Repository Overview
Personal dotfiles repository for Ismo Vuorinen.
Uses **Dotbot** (not GNU Stow) to symlink configuration files into place.
The directory layout follows the XDG Base Directory Specification.
## Directory Layout and Linking
| Source | Destination | Notes |
|---------------------|-------------------|-------------------------------------------|
| `base/*` | `~/.*` | Home-level dotfiles (`.` added by Dotbot) |
| `config/*` | `~/.config/` | Application configurations |
| `local/bin/*` | `~/.local/bin/` | Helper scripts and utilities |
| `local/share/*` | `~/.local/share/` | Data files |
| `local/man/**` | `~/.local/man/` | Manual pages |
| `ssh/*` | `~/.ssh/` | SSH configuration (mode 0600) |
| `hosts/<hostname>/` | Overlays | Host-specific overrides |
Installation: `./install` runs Dotbot with `install.conf.yaml`,
then applies `hosts/<hostname>/install.conf.yaml` if it exists.
## Commands
```bash
# Install dependencies (required before lint/test)
yarn install
# Linting
yarn lint # Run biome + prettier + editorconfig-checker
yarn lint:biome # Biome only
yarn lint:ec # EditorConfig checker only
yarn lint:md-table # Markdown table formatting check
yarn fix:md-table # Auto-fix markdown tables
# Formatting
yarn fix:biome # Autofix with biome (JS/TS/JSON/MD)
yarn fix:prettier # Autofix with prettier (YAML)
yarn format # Format with biome
yarn format:yaml # Format YAML files with prettier
# Testing (Bats - Bash Automated Testing System)
yarn test # Run all tests in tests/
# Run a single test file:
./node_modules/.bin/bats tests/dfm.bats
# Shell linting
shellcheck <script> # Lint shell scripts
```
## Pre-commit Hooks
Configured in `.pre-commit-config.yaml`: shellcheck, shfmt, biome,
yamllint, prettier, actionlint, stylua, fish_syntax/fish_indent.
Run `pre-commit run --all-files` to check everything.
## Commit Convention
Semantic Commit messages: `type(scope): summary`
(e.g., `fix(tmux): correct prefix binding`).
Enforced by commitlint extending `@ivuorinen/commitlint-config`.
## Architecture
### Shell Configuration Chain
Both `base/bashrc` and `base/zshrc` source `config/shared.sh`,
which loads:
- `config/exports` — environment variables, XDG dirs, PATH
- `config/alias` — shell aliases
Zsh additionally uses **antidote** (in `tools/antidote/`)
for plugin management and **oh-my-posh** for the prompt.
### msgr — Messaging Helper
`local/bin/msgr` provides colored output functions (`msgr msg`,
`msgr run`, `msgr yay`, `msgr err`, `msgr warn`). Sourced by `dfm`
and most scripts in `local/bin/`.
### dfm — Dotfiles Manager
`local/bin/dfm` is the main management script. Key commands:
- `dfm install all` — install everything in tiered stages
- `dfm brew install` / `dfm brew update` — Homebrew management
- `dfm apt upkeep` — APT package maintenance (Debian/Ubuntu)
- `dfm dotfiles fmt` / `dfm dotfiles shfmt` — format configs/scripts
- `dfm helpers <name>` — inspect aliases, colors, env, functions, path
- `dfm docs all` — regenerate documentation under `docs/`
- `dfm check arch` / `dfm check host` — system info
- `dfm scripts` — run scripts from `scripts/` (discovered via `@description` tags)
- `dfm tests` — test visualization helpers
### Submodules
External dependencies are git submodules (Dotbot, plugins,
tmux plugins, cheatsheets, antidote).
Managed by `add-submodules.sh`. All set to `ignore = dirty`.
Updated automatically via GitHub Actions on a schedule.
### Host-specific Configs
Machine-specific overrides live in `hosts/<hostname>/`
with their own `base/`, `config/`, and `install.conf.yaml`.
These are layered on top of the global config during installation.
## Code Style
- **EditorConfig**: 2-space indent, UTF-8, LF line endings.
See `.editorconfig` for per-filetype overrides
(4-space for PHP/fish, tabs for git config).
- **Shell scripts**: Must have a shebang or
`# shellcheck shell=bash` directive.
Follow shfmt settings in `.editorconfig`
(2-space indent, `binary_next_line`,
`switch_case_indent`, `space_redirects`, `function_next_line`).
- **Lua** (neovim config): Formatted with stylua (`stylua.toml`),
90-char line length.
- **JSON/JS/TS/Markdown**: Formatted with Biome (`biome.json`),
80-char width.
- **YAML**: Formatted with Prettier (`.prettierrc.json`),
validated with yamllint (`.yamllint.yml`).
## ShellCheck Disabled Rules
Defined in `.shellcheckrc`:
SC2039 (POSIX `local`), SC2166 (`-o` in test),
SC2154 (unassigned variables), SC1091 (source following),
SC2174 (mkdir -p -m), SC2016 (single-quote expressions).
## Gotchas
- **POSIX scripts**: `x-ssh-audit`, `x-codeql`, `x-until-error`,
`x-until-success`, `x-ssl-expiry-date` use `/bin/sh`.
Validate with `sh -n`, not `bash -n`.
- **Vendor file**: `local/bin/fzf-tmux` is vendored from
junegunn/fzf — do not modify.
- **Fish config**: `config/fish/` has its own config chain
(`config.fish`, `exports.fish`, `alias.fish`) plus 80+ functions.
- **Python**: Two scripts (`x-compare-versions.py`,
`x-git-largest-files.py`) linted by Ruff (config in `pyproject.toml`).
## Claude Code Configuration
- **Hooks** (`.claude/settings.json`):
- *PreToolUse*: Blocks edits to `fzf-tmux`, `yarn.lock`, `.yarn/`
- *PostToolUse*: Auto-runs `shfmt` on shell scripts after Edit/Write
- **Skills** (`.claude/skills/`):
- `shell-validate`: Auto-validates shell scripts (syntax + shellcheck)
## Package Manager
Yarn (v4.12.0) is the package manager. Do not use npm.


@@ -42,19 +42,25 @@ done
 # Mark certain repositories shallow
 git config -f .gitmodules submodule.antidote.shallow true
-_log() {
+# Log a message using msgr if available, else echo
+_log()
+{
+  local msg="$1"
   if command -v msgr > /dev/null 2>&1; then
-    msgr run_done "$1"
+    msgr run_done "$msg"
   else
-    echo " [ok] $1"
+    echo " [ok] $msg"
   fi
+  return 0
 }
-remove_old_submodule() {
+# Remove a stale git submodule and clean up references
+remove_old_submodule()
+{
   local name="$1" path="$2"
   # Remove working tree
-  if [ -d "$path" ]; then
+  if [[ -d "$path" ]]; then
     rm -rf "$path"
     _log "Removed $path"
   fi
@@ -66,13 +72,13 @@ remove_old_submodule() {
   git config --remove-section "submodule.$path" 2> /dev/null || true
   # Skip name-based cleanup if no submodule name provided
-  [ -z "$name" ] && return 0
+  [[ -z "$name" ]] && return 0
   # Remove .git/config section keyed by name
   git config --remove-section "submodule.$name" 2> /dev/null || true
   # Remove .git/modules/<name>/ cached repository
-  if [ -d ".git/modules/$name" ]; then
+  if [[ -d ".git/modules/$name" ]]; then
     rm -rf ".git/modules/$name"
     _log "Removed .git/modules/$name"
   fi


@@ -2,6 +2,7 @@
 # shellcheck shell=bash
 export DOTFILES="$HOME/.dotfiles"
+# Minimal PATH for x-have and utilities; full PATH set in shared.sh/exports
 export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$PATH"
 export SHARED_SCRIPTS_SOURCED=0
@@ -11,7 +12,7 @@ source "$DOTFILES/config/shared.sh"
 [ -f "${DOTFILES}/config/fzf/fzf.bash" ] &&
   source "${DOTFILES}/config/fzf/fzf.bash"
-# Import ssh keys in keychain
+# Import ssh keys in keychain (macOS-specific -A flag, silently fails on Linux)
 ssh-add -A 2>/dev/null
 x-have antidot && {
@@ -21,6 +22,3 @@ x-have antidot && {
 PROMPT_DIRTRIM=3
 PROMPT_COMMAND='PS1_CMD1=$(git branch --show-current 2>/dev/null)'
 PS1='\[\e[95m\]\u\[\e[0m\]@\[\e[38;5;22;2m\]\h\[\e[0m\] \[\e[38;5;33m\]\w\[\e[0m\] \[\e[92;2m\]${PS1_CMD1}\n\[\e[39m\]➜\[\e[0m\] '
-# Added by LM Studio CLI (lms)
-export PATH="$PATH:$HOME/.lmstudio/bin"

(Two file diffs suppressed because they are too large.)


@@ -7,18 +7,13 @@
 autoload -U promptinit; promptinit
 export DOTFILES="$HOME/.dotfiles"
-LOCAL_SHARE="$HOME/.local/share"
-export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$LOCAL_SHARE/nvim/mason/bin:$LOCAL_SHARE/bob/nvim-bin:$LOCAL_SHARE/cargo/bin:/opt/homebrew/bin:/usr/local/bin:$PATH"
+# Minimal PATH for x-have and utilities; full PATH set in shared.sh/exports
+export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$PATH"
 export SHARED_SCRIPTS_SOURCED=0
 source "$DOTFILES/config/shared.sh"
-# zsh completions directory
-[ -z "$ZSH_COMPLETIONS" ] && export ZSH_COMPLETIONS="$XDG_CONFIG_HOME/zsh/completion"
-# Add zsh completions to FPATH, compinit will be called later
-FPATH="$ZSH_COMPLETIONS:$FPATH"
+# zsh completions directory (ZSH_CUSTOM_COMPLETION_PATH set in shared.sh)
 ZSH_COMPDUMP="$XDG_CACHE_HOME/zsh/zcompdump-${SHORT_HOST}-${ZSH_VERSION}"
 source "$DOTFILES/config/zsh/antidote.zsh"
@@ -37,12 +32,9 @@ source_fzf_config
 x-have antidot && eval "$(antidot init)"
 autoload -Uz compinit bashcompinit
-compinit -d $ZSH_COMPDUMP
+compinit -d "$ZSH_COMPDUMP"
 bashcompinit
 # To customize prompt, run `p10k configure` or edit ~/.p10k.zsh.
 export P10K_CONFIG="$DOTFILES/config/zsh/p10k.zsh"
 [[ ! -f "$P10K_CONFIG" ]] || source "$P10K_CONFIG"
-# Added by LM Studio CLI (lms)
-export PATH="$PATH:$HOME/.lmstudio/bin"

View File

@@ -7,8 +7,6 @@ x-have eza && {
   alias ls="eza -h -s=type --git --icons --group-directories-first"
 }
-alias vim='vim -u "$XDG_CONFIG_HOME/vim/vimrc"'
 # Easier navigation: .., ..., ....
 alias ..="cd .."
 alias ...="cd ../.."

View File

@@ -93,13 +93,13 @@ expand-main:
 # Note that not all layouts respond to this command.
 increase-main:
   mod: mod1
-  key: ','
+  key: ","
 # Decrease the number of windows in the main pane.
 # Note that not all layouts respond to this command.
 decrease-main:
   mod: mod1
-  key: '.'
+  key: "."
 # General purpose command for custom layouts.
 # Functionality is layout-dependent.

View File

@@ -4,15 +4,15 @@
 # Set XDG directories if not already set
 # https://specifications.freedesktop.org/basedir-spec/basedir-spec-latest.html
-[ -z "$XDG_CONFIG_HOME" ] && export XDG_CONFIG_HOME="$HOME/.config"
-[ -z "$XDG_DATA_HOME" ] && export XDG_DATA_HOME="$HOME/.local/share"
-[ -z "$XDG_CACHE_HOME" ] && export XDG_CACHE_HOME="$HOME/.cache"
-[ -z "$XDG_STATE_HOME" ] && export XDG_STATE_HOME="$HOME/.local/state"
-[ -z "$XDG_BIN_HOME" ] && export XDG_BIN_HOME="$HOME/.local/bin"
-[ -z "$XDG_RUNTIME_DIR" ] && export XDG_RUNTIME_DIR="$HOME/.local/run"
+[ -z "${XDG_CONFIG_HOME:-}" ] && export XDG_CONFIG_HOME="$HOME/.config"
+[ -z "${XDG_DATA_HOME:-}" ] && export XDG_DATA_HOME="$HOME/.local/share"
+[ -z "${XDG_CACHE_HOME:-}" ] && export XDG_CACHE_HOME="$HOME/.cache"
+[ -z "${XDG_STATE_HOME:-}" ] && export XDG_STATE_HOME="$HOME/.local/state"
+[ -z "${XDG_BIN_HOME:-}" ] && export XDG_BIN_HOME="$HOME/.local/bin"
+[ -z "${XDG_RUNTIME_DIR:-}" ] && export XDG_RUNTIME_DIR="$HOME/.local/run"
 # if DOTFILES is not set, set it to the default location
-[ -z "$DOTFILES" ] && export DOTFILES="$HOME/.dotfiles"
+[ -z "${DOTFILES:-}" ] && export DOTFILES="$HOME/.dotfiles"
 export PATH="$XDG_BIN_HOME:$DOTFILES/local/bin:$XDG_DATA_HOME/bob/nvim-bin:$XDG_DATA_HOME/cargo/bin:/opt/homebrew/bin:/usr/local/bin:$PATH"
@@ -150,6 +150,7 @@ commit()
     git commit -a -m "$commitMessage"
 }
+# Run Laravel scheduler in a loop
 scheduler()
 {
     while :; do
@@ -282,7 +283,8 @@ export LESSHISTFILE="$XDG_STATE_HOME"/less/history
 export MANPAGER="less -X"
-# Always enable colored `grep` output
-export GREP_OPTIONS="--color=auto"
+# Note: GREP_OPTIONS is deprecated since GNU grep 2.21
+# Color is handled via alias in config/alias
 # check the window size after each command and, if necessary,
 # update the values of LINES and COLUMNS.
@@ -436,15 +438,19 @@ msg "Setting up Wakatime configuration"
 export WAKATIME_HOME="$XDG_STATE_HOME/wakatime"
 x-dc "$WAKATIME_HOME"
+# LM Studio CLI
+msg "Setting up LM Studio configuration"
+export PATH="$PATH:$HOME/.lmstudio/bin"
 # Misc
 msg "Setting up miscellaneous configuration"
 export ZSHZ_DATA="$XDG_STATE_HOME/z"
 export CHEAT_USE_FZF=true
 export SQLITE_HISTORY="${XDG_CACHE_HOME}/sqlite_history"
-[ -f "$XDG_CONFIG_HOME/exports-secret" ] && source "$XDG_CONFIG_HOME/exports-secret"
-[ -f "$XDG_CONFIG_HOME/exports-local" ] && source "$XDG_CONFIG_HOME/exports-local"
+if [ -f "$XDG_CONFIG_HOME/exports-secret" ]; then source "$XDG_CONFIG_HOME/exports-secret"; fi
+if [ -f "$XDG_CONFIG_HOME/exports-local" ]; then source "$XDG_CONFIG_HOME/exports-local"; fi
 # shellcheck source=./exports-lakka
-[ -f "$XDG_CONFIG_HOME/exports-$(hostname)" ] && source "$XDG_CONFIG_HOME/exports-$(hostname)"
+if [ -f "$XDG_CONFIG_HOME/exports-$(hostname)" ]; then source "$XDG_CONFIG_HOME/exports-$(hostname)"; fi
 # shellcheck source=./exports-lakka-secret
-[ -f "$XDG_CONFIG_HOME/exports-$(hostname)-secret" ] && source "$XDG_CONFIG_HOME/exports-$(hostname)-secret"
+if [ -f "$XDG_CONFIG_HOME/exports-$(hostname)-secret" ]; then source "$XDG_CONFIG_HOME/exports-$(hostname)-secret"; fi

View File

@@ -7,65 +7,67 @@ To be used with a companion fish function like this:
 """
-from __future__ import print_function
 import json
 import os
 import signal
 import subprocess
 import sys
-import traceback
-BASH = 'bash'
+BASH = "bash"
 FISH_READONLY = [
-    'PWD', 'SHLVL', 'history', 'pipestatus', 'status', 'version',
-    'FISH_VERSION', 'fish_pid', 'hostname', '_', 'fish_private_mode'
+    "PWD",
+    "SHLVL",
+    "history",
+    "pipestatus",
+    "status",
+    "version",
+    "FISH_VERSION",
+    "fish_pid",
+    "hostname",
+    "_",
+    "fish_private_mode",
 ]
-IGNORED = [
-    'PS1', 'XPC_SERVICE_NAME'
-]
+IGNORED = ["PS1", "XPC_SERVICE_NAME"]
 def ignored(name):
-    if name == 'PWD':  # this is read only, but has special handling
+    if name == "PWD":  # this is read only, but has special handling
         return False
     # ignore other read only variables
     if name in FISH_READONLY:
         return True
     if name in IGNORED or name.startswith("BASH_FUNC"):
         return True
-    if name.startswith('%'):
-        return True
-    return False
+    return name.startswith("%")
 def escape(string):
     # use json.dumps to reliably escape quotes and backslashes
-    return json.dumps(string).replace(r'$', r'\$')
+    return json.dumps(string).replace(r"$", r"\$")
 def escape_identifier(word):
-    return escape(word.replace('?', '\\?'))
+    return escape(word.replace("?", "\\?"))
 def comment(string):
-    return '\n'.join(['# ' + line for line in string.split('\n')])
+    return "\n".join(["# " + line for line in string.split("\n")])
 def gen_script():
     # Use the following instead of /usr/bin/env to read environment so we can
     # deal with multi-line environment variables (and other odd cases).
-    env_reader = "%s -c 'import os,json; print(json.dumps({k:v for k,v in os.environ.items()}))'" % (sys.executable)
-    args = [BASH, '-c', env_reader]
+    env_reader = f"{sys.executable} -c 'import os,json; print(json.dumps({{k:v for k,v in os.environ.items()}}))'"
+    args = [BASH, "-c", env_reader]
     output = subprocess.check_output(args, universal_newlines=True)
     old_env = output.strip()
     pipe_r, pipe_w = os.pipe()
-    if sys.version_info >= (3, 4):
-        os.set_inheritable(pipe_w, True)
-    command = 'eval $1 && ({}; alias) >&{}'.format(
-        env_reader,
-        pipe_w
-    )
-    args = [BASH, '-c', command, 'bass', ' '.join(sys.argv[1:])]
+    os.set_inheritable(pipe_w, True)
+    command = f"eval $1 && ({env_reader}; alias) >&{pipe_w}"
+    args = [BASH, "-c", command, "bass", " ".join(sys.argv[1:])]
     p = subprocess.Popen(args, universal_newlines=True, close_fds=False)
     os.close(pipe_w)
     with os.fdopen(pipe_r) as f:
@@ -73,9 +75,7 @@ def gen_script():
         alias_str = f.read()
     if p.wait() != 0:
         raise subprocess.CalledProcessError(
-            returncode=p.returncode,
-            cmd=' '.join(sys.argv[1:]),
-            output=new_env + alias_str
+            returncode=p.returncode, cmd=" ".join(sys.argv[1:]), output=new_env + alias_str
         )
     new_env = new_env.strip()
@@ -89,41 +89,41 @@ def gen_script():
            continue
        v1 = old_env.get(k)
        if not v1:
-           script_lines.append(comment('adding %s=%s' % (k, v)))
+           script_lines.append(comment(f"adding {k}={v}"))
        elif v1 != v:
-           script_lines.append(comment('updating %s=%s -> %s' % (k, v1, v)))
+           script_lines.append(comment(f"updating {k}={v1} -> {v}"))
            # process special variables
-           if k == 'PWD':
-               script_lines.append('cd %s' % escape(v))
+           if k == "PWD":
+               script_lines.append(f"cd {escape(v)}")
                continue
        else:
            continue
-       if k == 'PATH':
-           value = ' '.join([escape(directory)
-                             for directory in v.split(':')])
+       if k == "PATH":  # noqa: SIM108
+           value = " ".join([escape(directory) for directory in v.split(":")])
        else:
            value = escape(v)
-       script_lines.append('set -g -x %s %s' % (k, value))
    for var in set(old_env.keys()) - set(new_env.keys()):
-       script_lines.append(comment('removing %s' % var))
-       script_lines.append('set -e %s' % var)
-   script = '\n'.join(script_lines)
+       script_lines.append(f"set -g -x {k} {value}")
+       script_lines.append(comment(f"removing {var}"))
+       script_lines.append(f"set -e {var}")
+   script = "\n".join(script_lines)
    alias_lines = []
    for line in alias_str.splitlines():
        _, rest = line.split(None, 1)
        k, v = rest.split("=", 1)
        alias_lines.append("alias " + escape_identifier(k) + "=" + v)
-   alias = '\n'.join(alias_lines)
-   return script + '\n' + alias
-script_file = os.fdopen(3, 'w')
+   alias = "\n".join(alias_lines)
+   return script + "\n" + alias
+script_file = os.fdopen(3, "w")
 if not sys.argv[1:]:
-    print('__bass_usage', file=script_file, end='')
+    print("__bass_usage", file=script_file, end="")
     sys.exit(0)
 try:
@@ -131,7 +131,7 @@ try:
 except subprocess.CalledProcessError as e:
     sys.exit(e.returncode)
 except Exception:
-    print('Bass internal error!', file=sys.stderr)
+    print("Bass internal error!", file=sys.stderr)
     raise  # traceback will output to stderr
 except KeyboardInterrupt:
     signal.signal(signal.SIGINT, signal.SIG_DFL)

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 737994
 fish_pager_color_prefix f4b8e4
 fish_pager_color_completion c6d0f5
 fish_pager_color_description 737994

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 6e738d
 fish_pager_color_prefix f5bde6
 fish_pager_color_completion cad3f5
 fish_pager_color_description 6e738d

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 6c7086
 fish_pager_color_prefix f5c2e7
 fish_pager_color_completion cdd6f4
 fish_pager_color_description 6c7086

View File

@@ -14,7 +14,8 @@ if [[ $- =~ i ]]; then
 # To use custom commands instead of find, override _fzf_compgen_{path,dir}
 if ! declare -f _fzf_compgen_path > /dev/null; then
-  _fzf_compgen_path() {
+  _fzf_compgen_path()
+  {
     echo "$1"
     command find -L "$1" \
       -name .git -prune -o -name .hg -prune -o -name .svn -prune -o \( -type d -o -type f -o -type l \) \
@@ -23,7 +24,8 @@ if [[ $- =~ i ]]; then
 fi
 if ! declare -f _fzf_compgen_dir > /dev/null; then
-  _fzf_compgen_dir() {
+  _fzf_compgen_dir()
+  {
     command find -L "$1" \
       -name .git -prune -o -name .hg -prune -o -name .svn -prune -o -type d \
       -a -not -path "$1" -print 2> /dev/null | sed 's@^\./@@'
@@ -35,10 +37,13 @@ if [[ $- =~ i ]]; then
 # To redraw line after fzf closes (printf '\e[5n')
 bind '"\e[0n": redraw-current-line' 2> /dev/null
-__fzf_comprun() {
+__fzf_comprun()
+{
   if [[ "$(type -t _fzf_comprun 2>&1)" = function ]]; then
     _fzf_comprun "$@"
-  elif [[ -n "${TMUX_PANE-}" ]] && { [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]; }; then
+  elif [[ -n "${TMUX_PANE-}" ]] && {
+    [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]
+  }; then
     shift
     fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- "$@"
   else
@@ -47,7 +52,8 @@ if [[ $- =~ i ]]; then
   fi
 }
-__fzf_orig_completion() {
+__fzf_orig_completion()
+{
   local l comp f cmd
   while read -r l; do
     if [[ "$l" =~ ^(.*\ -F)\ *([^ ]*).*\ ([^ ]*)$ ]]; then
@@ -63,7 +69,8 @@ if [[ $- =~ i ]]; then
   done
 }
-_fzf_opts_completion() {
+_fzf_opts_completion()
+{
   local cur prev opts
   COMPREPLY=()
   cur="${COMP_WORDS[COMP_CWORD]}"
@@ -134,7 +141,8 @@ if [[ $- =~ i ]]; then
   return 0
 }
-_fzf_handle_dynamic_completion() {
+_fzf_handle_dynamic_completion()
+{
   local cmd orig_var orig ret orig_cmd orig_complete
   cmd="$1"
   shift
@@ -161,7 +169,8 @@ if [[ $- =~ i ]]; then
   fi
 }
-__fzf_generic_path_completion() {
+__fzf_generic_path_completion()
+{
   local cur base dir leftover matches trigger cmd
   cmd="${COMP_WORDS[0]}"
   if [[ $cmd == \\* ]]; then
@@ -207,7 +216,8 @@ if [[ $- =~ i ]]; then
   fi
 }
-_fzf_complete() {
+_fzf_complete()
+{
   # Split arguments around --
   local args rest str_arg i sep
   args=("$@")
@@ -253,50 +263,59 @@ if [[ $- =~ i ]]; then
   fi
 }
-_fzf_path_completion() {
+_fzf_path_completion()
+{
   __fzf_generic_path_completion _fzf_compgen_path "-m" "" "$@"
 }
 # Deprecated. No file only completion.
-_fzf_file_completion() {
+_fzf_file_completion()
+{
   _fzf_path_completion "$@"
 }
-_fzf_dir_completion() {
+_fzf_dir_completion()
+{
   __fzf_generic_path_completion _fzf_compgen_dir "" "/" "$@"
 }
-_fzf_complete_kill() {
+_fzf_complete_kill()
+{
   _fzf_proc_completion "$@"
 }
-_fzf_proc_completion() {
+_fzf_proc_completion()
+{
   _fzf_complete -m --header-lines=1 --preview 'echo {}' --preview-window down:3:wrap --min-height 15 -- "$@" < <(
-    command ps -eo user,pid,ppid,start,time,command 2>/dev/null ||
-      command ps -eo user,pid,ppid,time,args # For BusyBox
+    command ps -eo user,pid,ppid,start,time,command 2> /dev/null \
+      || command ps -eo user,pid,ppid,time,args # For BusyBox
   )
 }
-_fzf_proc_completion_post() {
+_fzf_proc_completion_post()
+{
   awk '{print $2}'
 }
-_fzf_host_completion() {
+_fzf_host_completion()
+{
   _fzf_complete +m -- "$@" < <(
     command cat <(command tail -n +1 ~/.ssh/config ~/.ssh/config.d/* /etc/ssh/ssh_config 2> /dev/null | command grep -i '^\s*host\(name\)\? ' | awk '{for (i = 2; i <= NF; i++) print $1 " " $i}' | command grep -v '[*?%]') \
       <(command grep -oE '^[[a-z0-9.,:-]+' ~/.ssh/known_hosts | tr ',' '\n' | tr -d '[' | awk '{ print $1 " " $1 }') \
-      <(command grep -v '^\s*\(#\|$\)' /etc/hosts | command grep -Fv '0.0.0.0') |
-      awk '{if (length($2) > 0) {print $2}}' | sort -u
+      <(command grep -v '^\s*\(#\|$\)' /etc/hosts | command grep -Fv '0.0.0.0') \
+      | awk '{if (length($2) > 0) {print $2}}' | sort -u
   )
 }
-_fzf_var_completion() {
+_fzf_var_completion()
+{
   _fzf_complete -m -- "$@" < <(
     declare -xp | sed -En 's|^declare [^ ]+ ([^=]+).*|\1|p'
   )
 }
-_fzf_alias_completion() {
+_fzf_alias_completion()
+{
   _fzf_complete -m -- "$@" < <(
     alias | sed -En 's|^alias ([^=]+).*|\1|p'
   )
@@ -327,7 +346,8 @@ if [[ $- =~ i ]]; then
   _fzf_completion_loader=1
 fi
-__fzf_defc() {
+__fzf_defc()
+{
   local cmd func opts orig_var orig def
   cmd="$1"
   func="$2"
@@ -354,7 +374,8 @@ if [[ $- =~ i ]]; then
 unset cmd d_cmds a_cmds
-_fzf_setup_completion() {
+_fzf_setup_completion()
+{
   local kind fn cmd
   kind=$1
   fn=_fzf_${1}_completion

View File

@@ -13,7 +13,8 @@
 # Key bindings
 # ------------
-__fzf_select__() {
+__fzf_select__()
+{
   local cmd opts
   cmd="${FZF_CTRL_T_COMMAND:-"command find -L . -mindepth 1 \\( -path '*/\\.*' -o -fstype 'sysfs' -o -fstype 'devfs' -o -fstype 'devtmpfs' -o -fstype 'proc' \\) -prune \
     -o -type f -print \
@@ -21,27 +22,32 @@ __fzf_select__() {
     -o -type l -print 2> /dev/null | cut -b3-"}"
   opts="--height ${FZF_TMUX_HEIGHT:-40%} --bind=ctrl-z:ignore --reverse ${FZF_DEFAULT_OPTS-} ${FZF_CTRL_T_OPTS-} -m"
   # shellcheck disable=SC2091 # Intentionally execute output of __fzfcmd
-  eval "$cmd" | FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) "$@" |
-    while read -r item; do
+  eval "$cmd" | FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) "$@" \
+    | while read -r item; do
       printf '%q ' "$item" # escape special chars
     done
 }
 if [[ $- =~ i ]]; then
-  __fzfcmd() {
-    [[ -n "${TMUX_PANE-}" ]] && { [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]; } &&
-      echo "fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- " || echo "fzf"
+  __fzfcmd()
+  {
+    [[ -n "${TMUX_PANE-}" ]] && {
+      [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]
+    } \
+      && echo "fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- " || echo "fzf"
   }
-  fzf-file-widget() {
+  fzf-file-widget()
+  {
    local selected
    selected="$(__fzf_select__ "$@")"
    READLINE_LINE="${READLINE_LINE:0:$READLINE_POINT}$selected${READLINE_LINE:$READLINE_POINT}"
    READLINE_POINT=$((READLINE_POINT + ${#selected}))
  }
-  __fzf_cd__() {
+  __fzf_cd__()
+  {
    local cmd opts dir
    cmd="${FZF_ALT_C_COMMAND:-"command find -L . -mindepth 1 \\( -path '*/\\.*' -o -fstype 'sysfs' -o -fstype 'devfs' -o -fstype 'devtmpfs' -o -fstype 'proc' \\) -prune \
      -o -type d -print 2> /dev/null | cut -b3-"}"
@@ -53,16 +59,17 @@ if [[ $- =~ i ]]; then
    ) && printf 'builtin cd -- %q' "$dir"
  }
-  __fzf_history__() {
+  __fzf_history__()
+  {
    local output opts script
    opts="--height ${FZF_TMUX_HEIGHT:-40%} --bind=ctrl-z:ignore ${FZF_DEFAULT_OPTS-} -n2..,.. --scheme=history --bind=ctrl-r:toggle-sort ${FZF_CTRL_R_OPTS-} +m --read0"
    script='BEGIN { getc; $/ = "\n\t"; $HISTCOUNT = $ENV{last_hist} + 1 } s/^[ *]//; print $HISTCOUNT - $. . "\t$_" if !$seen{$_}++'
    # shellcheck disable=SC2091 # Intentionally execute output of __fzfcmd
    output=$(
      set +o pipefail
-      builtin fc -lnr -2147483648 |
-        last_hist=$(HISTTIMEFORMAT='' builtin history 1) perl -n -l0 -e "$script" |
-        FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) --query "$READLINE_LINE"
+      builtin fc -lnr -2147483648 \
+        | last_hist=$(HISTTIMEFORMAT='' builtin history 1) perl -n -l0 -e "$script" \
+        | FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) --query "$READLINE_LINE"
    ) || return
    READLINE_LINE=${output#*$'\t'}
    if [[ -z "$READLINE_POINT" ]]; then

View File

@@ -52,4 +52,4 @@ keybindings:
 prs: []
 repoPaths: {}
 pager:
-  diff: ''
+  diff: ""

View File

@@ -1,3 +1,3 @@
 ---
 git_protocol: https
-version: '1'
+version: "1"

View File

@@ -1,4 +1,4 @@
-#!/bin/env bash
-[ -z "$NVM_DIR" ] && export NVM_DIR="$HOME/.config/nvm"
-[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
+#!/usr/bin/env bash
+[[ -z "$NVM_DIR" ]] && export NVM_DIR="$HOME/.config/nvm"
+[[ -s "$NVM_DIR/nvm.sh" ]] && \. "$NVM_DIR/nvm.sh" # This loads nvm

View File

@@ -5,7 +5,7 @@
 # shellcheck shell=bash
 # Defaults
-[ -z "$DOTFILES" ] && export DOTFILES="$HOME/.dotfiles"
+[[ -z "${DOTFILES:-}" ]] && export DOTFILES="$HOME/.dotfiles"
 DOTFILES_CURRENT_SHELL=$(basename "$SHELL")
 export DOTFILES_CURRENT_SHELL
@@ -15,7 +15,7 @@ VERBOSE="${VERBOSE:-0}"
 DEBUG="${DEBUG:-0}"
 # Enable debugging with DEBUG=1
-[ "${DEBUG:-0}" -eq 1 ] && set -x
+[[ "${DEBUG:-0}" -eq 1 ]] && set -x
 # Detect the current shell
 CURRENT_SHELL=$(ps -p $$ -ocomm= | awk -F/ '{print $NF}')
@@ -33,9 +33,10 @@ x-path-prepend()
     ;;
   *)
     echo "Unsupported shell: $CURRENT_SHELL"
-    exit 1
+    return 1
     ;;
   esac
+  return 0
 }
 # Function to set environment variables based on the shell
@@ -52,9 +53,10 @@ x-set-env()
     ;;
   *)
     echo "Unsupported shell: $CURRENT_SHELL"
-    exit 1
+    return 1
    ;;
  esac
+  return 0
 }
 # Explicitly set XDG folders, if not already set
@@ -74,16 +76,17 @@ x-path-prepend "$DOTFILES/local/bin"
 x-path-prepend "$XDG_BIN_HOME"
 # Custom completion paths
-[ -z "$ZSH_CUSTOM_COMPLETION_PATH" ] && export ZSH_CUSTOM_COMPLETION_PATH="$XDG_CONFIG_HOME/zsh/completion"
+[[ -z "${ZSH_CUSTOM_COMPLETION_PATH:-}" ]] && export ZSH_CUSTOM_COMPLETION_PATH="$XDG_CONFIG_HOME/zsh/completion"
 x-dc "$ZSH_CUSTOM_COMPLETION_PATH"
-export FPATH="$ZSH_CUSTOM_COMPLETION_PATH:$FPATH"
+export FPATH="$ZSH_CUSTOM_COMPLETION_PATH:${FPATH:-}"
 if ! declare -f msg > /dev/null; then
   # Function to print messages if VERBOSE is enabled
   # $1 - message (string)
   msg()
   {
-    [ "$VERBOSE" -eq 1 ] && msgr msg "$1"
+    local message="$1"
+    [[ "$VERBOSE" -eq 1 ]] && msgr msg "$message"
     return 0
   }
   msg "msg was not defined, defined it now"
@@ -95,7 +98,8 @@ if ! declare -f msg_err > /dev/null; then
   # $1 - error message (string)
   msg_err()
   {
-    msgr err "$1" >&2
+    local message="$1"
+    msgr err "$message" >&2
     exit 1
   }
 fi
@@ -106,7 +110,8 @@ if ! declare -f msg_done > /dev/null; then
   # $1 - message (string)
   msg_done()
   {
-    msgr "done" "$1"
+    local message="$1"
+    msgr "done" "$message"
     return 0
   }
 fi
@@ -117,7 +122,8 @@ if ! declare -f msg_run > /dev/null; then
   # $1 - message (string)
   msg_run()
   {
-    msgr run "$1"
+    local message="$1"
+    msgr run "$message"
     return 0
   }
 fi
@@ -128,7 +134,8 @@ if ! declare -f msg_ok > /dev/null; then
   # $1 - message (string)
   msg_ok()
   {
-    msgr ok "$1"
+    local message="$1"
+    msgr ok "$message"
     return 0
   }
 fi
@@ -143,12 +150,16 @@ if ! declare -f array_diff > /dev/null; then
   # Source: https://stackoverflow.com/a/42399479/594940
   array_diff()
   {
+    local result_var="$1"
+    local arr1_name="$2"
+    local arr2_name="$3"
     # shellcheck disable=SC1083,SC2086
-    eval local ARR1=\(\"\${$2[@]}\"\)
+    eval local ARR1=\(\"\${${arr1_name}[@]}\"\)
     # shellcheck disable=SC1083,SC2086
-    eval local ARR2=\(\"\${$3[@]}\"\)
+    eval local ARR2=\(\"\${${arr2_name}[@]}\"\)
     local IFS=$'\n'
-    mapfile -t "$1" < <(comm -23 <(echo "${ARR1[*]}" | sort) <(echo "${ARR2[*]}" | sort))
+    mapfile -t "$result_var" < <(comm -23 <(echo "${ARR1[*]}" | sort) <(echo "${ARR2[*]}" | sort))
+    return 0
   }
 fi

View File

@@ -7,13 +7,13 @@ DEFAULT_NAME="main"
 CURRENT_SESSION=$(tmux display-message -p "#{session_name}")
 # Check that the session has a name
-if [ "$CURRENT_SESSION" = "#{session_name}" ] || [ "$CURRENT_SESSION" = "0" ]; then
+if [[ "$CURRENT_SESSION" = "#{session_name}" ]] || [[ "$CURRENT_SESSION" = "0" ]]; then
   # Check if the default name is already in use
   if tmux has-session -t "$DEFAULT_NAME" 2> /dev/null; then
     # Query the user for a new name
     echo "Session name '$DEFAULT_NAME' is already in use. Enter a new name:"
     read -r NEW_NAME
-    while tmux has-session -t "$NEW_NAME" 2> /dev/null || [ -z "$NEW_NAME" ]; do
+    while tmux has-session -t "$NEW_NAME" 2> /dev/null || [[ -z "$NEW_NAME" ]]; do
       echo "Name '$NEW_NAME' is invalid or already in use. Enter a new name:"
       read -r NEW_NAME
     done

View File

@@ -13,7 +13,9 @@ if ! command -v sesh &>/dev/null; then
   exit 0
 fi
-pick_with_gum() {
+# Pick a sesh session using gum filter
+pick_with_gum()
+{
   sesh list -i \
     | gum filter \
       --limit 1 \
@@ -22,6 +24,7 @@ pick_with_gum() {
       --placeholder 'Pick a sesh' \
       --height 50 \
       --prompt='⚡'
+  return 0
 }
 FZF_COMMON_OPTS=(
@@ -40,15 +43,23 @@ FZF_COMMON_OPTS=(
   --preview 'sesh preview {}'
 )
-pick_with_fzf_tmux() {
+# Pick a sesh session using fzf-tmux popup
+pick_with_fzf_tmux()
+{
   sesh list --icons | fzf-tmux -p 80%,70% "${FZF_COMMON_OPTS[@]}"
+  return 0
 }
-pick_with_fzf() {
+# Pick a sesh session using fzf inline
+pick_with_fzf()
+{
   sesh list --icons | fzf "${FZF_COMMON_OPTS[@]}"
+  return 0
 }
-pick_with_select() {
+# Pick a sesh session using bash select menu
+pick_with_select()
+{
   local sessions
   mapfile -t sessions < <(sesh list)
   if [[ ${#sessions[@]} -eq 0 ]]; then

View File

@@ -0,0 +1,40 @@
# Skip Already-Installed Cargo Packages
## Problem
`install-cargo-packages.sh` runs `cargo install-update -a` to update installed
packages, then runs `cargo install` for every package in the list — including
ones that are already installed and up-to-date. This wastes time rebuilding
packages that don't need it.
## Solution
Capture the `cargo install-update -a` output, parse installed package names,
and skip `cargo install` for any package that appeared in the update output.
## Changes
**File:** `scripts/install-cargo-packages.sh`
1. Declare an associative array `installed_packages` at the top.
2. In the `cargo-install-update` section, capture output with `tee /dev/stderr`
so it displays in real-time while also being stored in a variable.
3. Parse the captured output with `awk` — extract the first column from lines
matching a version pattern (`v[0-9]+\.[0-9]+`), skipping the header.
4. Populate `installed_packages` associative array from parsed names.
5. In `install_packages()`, check each package against the array. If found, log
a skip message via `msgr` and continue. If not found, install as before.
6. If `cargo-install-update` is not available, the array stays empty and all
packages install normally (preserves existing behavior).
## Output Parsing
The `cargo install-update -a` output format:
```text
Package Installed Latest Needs update
zoxide v0.9.8 v0.9.9 Yes
bkt v0.8.2 v0.8.2 No
```
Extraction: `awk '/v[0-9]+\.[0-9]+/ { print $1 }'` gets package names.
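The capture-and-skip flow can be sketched roughly as follows; the sample output and the `skip`/`install` messages are illustrative, not the actual script's wording:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Sample `cargo install-update -a` output (stand-in for the captured variable).
sample='Package      Installed  Latest   Needs update
zoxide       v0.9.8     v0.9.9   Yes
bkt          v0.8.2     v0.8.2   No'

# Parse package names from lines that contain a version pattern,
# skipping the header, and record them in an associative array.
declare -A installed_packages=()
while read -r pkg; do
  installed_packages["$pkg"]=1
done < <(awk '/v[0-9]+\.[0-9]+/ { print $1 }' <<< "$sample")

# In install_packages(), a package found in the array is skipped.
for pkg in zoxide ripgrep; do
  if [[ -n "${installed_packages[$pkg]:-}" ]]; then
    echo "skip $pkg"    # -> skip zoxide
  else
    echo "install $pkg" # -> install ripgrep (would run `cargo install` here)
  fi
done
```

Because the array is only populated when `cargo-install-update` output is available, an empty array naturally falls through to installing every package, preserving the existing behavior.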

View File

@@ -0,0 +1,55 @@
# dfm Cleanup Design
## Summary
Clean up `local/bin/dfm` to fix bugs, remove dead code, improve
cross-platform portability, and make error handling consistent.
## Changes
### 1. Bash Version Bootstrap
Add a check at the top of the script (after variable declarations)
that requires bash 4.0+. On macOS, if bash is too old, install
Homebrew (if missing) and bash, then print instructions and exit.
The check itself uses only bash 3.2-compatible syntax.
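A minimal sketch of such a guard, using only bash 3.2-compatible syntax; the function name and messages are illustrative, and the Homebrew step is reduced to a hint rather than the actual install logic:

```shell
#!/usr/bin/env bash
# Illustrative bootstrap guard; BASH_VERSINFO[0] is the major version.
require_bash4() {
  if [ "${BASH_VERSINFO[0]}" -lt 4 ]; then
    echo "dfm requires bash >= 4.0, found ${BASH_VERSION}" >&2
    if [ "$(uname -s)" = "Darwin" ]; then
      # The real design installs Homebrew and bash here; sketched as a hint.
      echo "On macOS: brew install bash, then re-run dfm" >&2
    fi
    return 1
  fi
  return 0
}

require_bash4 && echo "bash ${BASH_VERSION} is new enough"
```

Since `[ ... ]`, `BASH_VERSINFO`, and `uname` all exist in bash 3.2, the guard itself runs even on the old macOS system bash it is rejecting.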
### 2. Remove Fish Dead Code
Remove `CURRENT_SHELL` detection, `source_file()` function, and all
fish branches. Replace `source_file` calls with direct `source`.
The script has a bash shebang — fish handling was unreachable.
### 3. Bug Fixes
- Remove `ntfy` from install menu (no install script exists)
- Fix `msg)` → `msgr)` case label in `section_tests`
- Guard all `shift` calls against empty argument lists
- Quote `$width` in `menu_builder` seq calls
- Fix `$"..."` locale string → `"..."` in `usage()`
- Fix `exit 0` on apt.txt error → `return 1`
### 4. Replace `declare -A` in `section_scripts`
Replace associative array with indexed `"name:desc"` array,
matching the pattern used everywhere else in the script.
Move `get_script_description()` to top-level (out of the function).
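The target pattern can be sketched like this (entry names are illustrative, not the script's actual list):

```shell
#!/usr/bin/env bash
# Indexed "name:desc" array — works on bash 3.2, unlike `declare -A`.
SCRIPTS=(
  "install-fonts:Install programming fonts"
  "install-z:Install z"
)

# Look up a description by splitting each entry on the first colon.
get_script_description() {
  local name="$1" entry
  for entry in "${SCRIPTS[@]}"; do
    if [ "${entry%%:*}" = "$name" ]; then
      echo "${entry#*:}"
      return 0
    fi
  done
  return 1
}
```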
### 5. Early-Return Guards & exit → return
- `section_brew()`: Early return with `msgr warn` if brew unavailable.
Remove duplicate `! x-have brew` check.
- `section_apt()`: Same pattern for apt.
- `section_check()`: Replace `exit` with `return`.
- `section_apt() install`: Replace `exit` with `return`.
- `section_brew() untracked`: Replace `exit` with `return`.
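The guard shape shared by these fixes, as a sketch (`msgr warn` is replaced with a plain `echo` so the fragment is self-contained):

```shell
#!/usr/bin/env bash
# Early-return guard: skip the section when its package manager is
# unavailable, instead of `exit`-ing the whole script.
section_brew() {
  if ! command -v brew > /dev/null 2>&1; then
    echo "warn: brew not available, skipping" >&2
    return 1
  fi
  echo "running brew section"
}
```

Because the function returns rather than exits, `dfm install all` can continue with the remaining tiers on hosts without Homebrew.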
## Files Changed
- `local/bin/dfm` (all changes)
## Verification
- `yarn test` (existing bats test)
- `shellcheck local/bin/dfm`
- `bash -n local/bin/dfm` (syntax check)


@@ -0,0 +1,46 @@
# x-* Scripts Cleanup Design
## Summary
Comprehensive cleanup of all 34 x-* utility scripts in `local/bin/`.
Fix critical bugs, consolidate duplicates, standardize patterns.
## Changes
### Removals
- `x-mkd`, `x-mkd.md`, `tests/x-mkd.bats` — unused, and its `cd` runs in a subshell so it never changes the caller's directory
- `x-validate-sha256sum.sh`, `x-validate-sha256sum.sh.md` — duplicates x-sha256sum-matcher
### Thin Wrappers (delegate to x-path)
- `x-path-append` → calls `x-path append "$@"`
- `x-path-prepend` → calls `x-path prepend "$@"`
- `x-path-remove` → calls `x-path remove "$@"`
### Critical Fixes
- `x-clean-vendordirs`: invoke `msgr` as a command (it is on `PATH`)
- `x-foreach`: replace `eval` with direct `"$@"` execution
- `x-ip`: add error handling, curl check
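The `eval` fix can be sketched as follows (the item list is illustrative): executing `"$@"` directly preserves each argument as a single word, so arguments with spaces are never re-parsed by the shell the way `eval` would.

```shell
#!/usr/bin/env bash
# Sketch of the x-foreach pattern: run the given command once per item,
# passing the command through "$@" with no eval and no re-quoting.
run_for_each() {
  local item
  for item in a "b c"; do # items are illustrative
    "$@" "$item"          # "b c" stays one argument
  done
}
```

For example, `run_for_each echo prefix` prints `prefix a` and then `prefix b c`.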
### Consistency Fixes
- Fix `#!/bin/bash` → `#!/usr/bin/env bash` (x-env-list, x-localip)
- POSIX scripts keep `#!/bin/sh`
- Add `set -euo pipefail` where missing in bash scripts
- Use XDG variables instead of hardcoded paths (x-change-alacritty-theme)
- Quote unquoted variables
### Minor Fixes
- `x-multi-ping`: remove unused VERBOSE variable
- `x-when-down`, `x-when-up`: add error handling
- `x-term-colors`: add usage message
- `x-record`: fix undefined notify-call reference
## Verification
- `yarn test` — ensure remaining tests pass
- `shellcheck` on modified scripts
- `bash -n` syntax check on all modified bash scripts


@@ -1,5 +1,5 @@
--- ---
- include: 'tools/dotbot-defaults.yaml' - include: "tools/dotbot-defaults.yaml"
- shell: - shell:
- echo "Configuring air" - echo "Configuring air"
- link: - link:
@@ -7,7 +7,7 @@
force: true force: true
glob: true glob: true
path: hosts/air/base/** path: hosts/air/base/**
prefix: '.' prefix: "."
~/.config/: ~/.config/:
glob: true glob: true
force: true force: true


@@ -1,5 +1,5 @@
--- ---
- include: 'tools/dotbot-defaults.yaml' - include: "tools/dotbot-defaults.yaml"
- shell: - shell:
- echo "Configuring lakka" - echo "Configuring lakka"
- link: - link:
@@ -7,7 +7,7 @@
force: true force: true
glob: true glob: true
path: hosts/lakka/base/** path: hosts/lakka/base/**
prefix: '.' prefix: "."
~/.config/: ~/.config/:
glob: true glob: true
force: true force: true


@@ -1,5 +1,5 @@
--- ---
- include: 'tools/dotbot-defaults.yaml' - include: "tools/dotbot-defaults.yaml"
- shell: - shell:
- echo "Configuring s" - echo "Configuring s"
- link: - link:
@@ -7,7 +7,7 @@
force: true force: true
glob: true glob: true
path: hosts/s/base/** path: hosts/s/base/**
prefix: '.' prefix: "."
~/.config/: ~/.config/:
glob: true glob: true
force: true force: true


@@ -1,5 +1,5 @@
--- ---
- include: 'tools/dotbot-defaults.yaml' - include: "tools/dotbot-defaults.yaml"
- shell: - shell:
- echo "Configuring tunkki" - echo "Configuring tunkki"
- link: - link:
@@ -7,7 +7,7 @@
force: true force: true
glob: true glob: true
path: hosts/tunkki/base/** path: hosts/tunkki/base/**
prefix: '.' prefix: "."
~/.config/: ~/.config/:
glob: true glob: true
force: true force: true


@@ -22,9 +22,9 @@ git submodule update --init --recursive "${DOTBOT_DIR}"
if [ "${DOTBOT_HOST}" != "" ]; then if [ "${DOTBOT_HOST}" != "" ]; then
DOTBOT_HOST_CONFIG="${BASEDIR}/hosts/${DOTBOT_HOST}/${CONFIG}" DOTBOT_HOST_CONFIG="${BASEDIR}/hosts/${DOTBOT_HOST}/${CONFIG}"
echo "-> Trying if host config can be found: ${DOTBOT_HOST_CONFIG}" echo "-> Trying if host config can be found: ${DOTBOT_HOST_CONFIG}"
[ -r "$DOTBOT_HOST_CONFIG" ] && [ -f "$DOTBOT_HOST_CONFIG" ] && [ -r "$DOTBOT_HOST_CONFIG" ] && [ -f "$DOTBOT_HOST_CONFIG" ] \
echo "(!) Found $DOTBOT_HOST_CONFIG" && && echo "(!) Found $DOTBOT_HOST_CONFIG" \
"$DOTBOT_BIN_PATH" \ && "$DOTBOT_BIN_PATH" \
-d "$BASEDIR" \ -d "$BASEDIR" \
--plugin-dir=tools/dotbot-include \ --plugin-dir=tools/dotbot-include \
-c "$DOTBOT_HOST_CONFIG" \ -c "$DOTBOT_HOST_CONFIG" \


@@ -1,5 +1,5 @@
--- ---
- include: 'tools/dotbot-defaults.yaml' - include: "tools/dotbot-defaults.yaml"
- clean: - clean:
~/: ~/:
@@ -34,7 +34,7 @@
force: true force: true
glob: true glob: true
path: base/* path: base/*
prefix: '.' prefix: "."
# Most of the configs # Most of the configs
~/.config/: ~/.config/:
glob: true glob: true


@@ -20,7 +20,7 @@ Some problematic code has been fixed per `shellcheck` suggestions.
## Sourced ## Sourced
| Script | Source | | Script | Source |
| ----------------------- | ----------------- | |-------------------------|-------------------|
| `x-dupes` | skx/sysadmin-util | | `x-dupes` | skx/sysadmin-util |
| `x-foreach` | mvdan/dotfiles | | `x-foreach` | mvdan/dotfiles |
| `x-multi-ping` | skx/sysadmin-util | | `x-multi-ping` | skx/sysadmin-util |


@@ -1,7 +1,9 @@
#!/usr/bin/env bash #!/usr/bin/env bash
# A script for encrypting and decrypting files or directories with age and SSH keys # A script for encrypting and decrypting files or directories with age and SSH keys
VERSION="1.0.0" set -euo pipefail
VERSION="1.1.0"
# Default ENV values # Default ENV values
KEYS_FILE="${AGE_KEYSFILE:-$HOME/.ssh/keys.txt}" KEYS_FILE="${AGE_KEYSFILE:-$HOME/.ssh/keys.txt}"
@@ -9,14 +11,49 @@ KEYS_SOURCE="${AGE_KEYSSOURCE:-https://github.com/ivuorinen.keys}"
LOG_FILE="${AGE_LOGFILE:-$HOME/.cache/a.log}" LOG_FILE="${AGE_LOGFILE:-$HOME/.cache/a.log}"
VERBOSE=false VERBOSE=false
DELETE_ORIGINAL=false
FORCE=false
# Parse flags for verbosity # Check for required dependencies
for arg in "$@"; do check_dependencies()
if [[ "$arg" == "-v" || "$arg" == "--verbose" ]]; then {
VERBOSE=true if ! command -v age &> /dev/null; then
break echo "Error: 'age' is not installed. Please install it first." >&2
echo " brew install age # macOS" >&2
echo " apt install age # Debian/Ubuntu" >&2
echo " dnf install age # Fedora" >&2
exit 1
fi fi
if ! command -v curl &> /dev/null; then
echo "Error: 'curl' is not installed." >&2
exit 1
fi
}
# Parse flags
parse_flags()
{
local args=()
for arg in "$@"; do
case "$arg" in
-v | --verbose)
VERBOSE=true
;;
--delete)
DELETE_ORIGINAL=true
;;
-f | --force)
FORCE=true
;;
*)
args+=("$arg")
;;
esac
done done
# Return remaining arguments
printf '%s\n' "${args[@]}"
}
# Ensure log directory and file exist with correct permissions # Ensure log directory and file exist with correct permissions
prepare_log_file() prepare_log_file()
@@ -38,8 +75,6 @@ prepare_log_file()
chmod 0600 "$LOG_FILE" chmod 0600 "$LOG_FILE"
} }
prepare_log_file
# Logging function # Logging function
log_message() log_message()
{ {
@@ -56,7 +91,7 @@ log_message()
print_help() print_help()
{ {
cat << EOF cat << EOF
Usage: a [command] [file_or_directory] [options] Usage: a [options] [command] [file_or_directory]
Commands: Commands:
e, enc, encrypt Encrypt the specified file or directory e, enc, encrypt Encrypt the specified file or directory
@@ -65,12 +100,14 @@ Commands:
version, --version Show version information version, --version Show version information
Options: Options:
-v, --verbose Print log messages to console in addition to writing to log file -v, --verbose Print log messages to console
--delete Delete original files after successful encryption
-f, --force Overwrite existing output files without prompting
Environment Variables: Environment Variables:
AGE_KEYSFILE Path to the SSH keys file (default: $HOME/.ssh/keys.txt) AGE_KEYSFILE Path to the SSH keys file (default: \$HOME/.ssh/keys.txt)
AGE_KEYSSOURCE URL to fetch SSH keys if keys file does not exist AGE_KEYSSOURCE URL to fetch SSH keys if keys file does not exist
AGE_LOGFILE Path to the log file (default: $HOME/.cache/a.log) AGE_LOGFILE Path to the log file (default: \$HOME/.cache/a.log)
Examples: Examples:
Encrypt a file: Encrypt a file:
@@ -79,14 +116,21 @@ Examples:
Encrypt a directory: Encrypt a directory:
a e /path/to/directory a e /path/to/directory
Encrypt and delete originals:
a --delete e file.txt
Decrypt a file: Decrypt a file:
a d file.txt.age a d file.txt.age
Force overwrite existing files:
a -f e file.txt
Specify a custom keys file: Specify a custom keys file:
AGE_KEYSFILE=/path/to/keys.txt a e file.txt AGE_KEYSFILE=/path/to/keys.txt a e file.txt
Specify a custom keys source and log file: Requirements:
AGE_KEYSSOURCE=https://example.com/keys.txt AGE_LOGFILE=/tmp/a.log a d file.txt.age - age (encryption tool): https://github.com/FiloSottile/age
- curl (for fetching keys)
EOF EOF
} }
@@ -115,26 +159,104 @@ fetch_keys_if_missing()
fi fi
} }
# Function to encrypt a single file
encrypt_single_file()
{
local file="$1"
# Skip already encrypted files
if [[ "$file" == *.age ]]; then
log_message "Skipping already encrypted file: $file"
return 0
fi
local output_file="${file}.age"
# Check if output file exists
if [[ -f "$output_file" && "$FORCE" != true ]]; then
log_message "Error: Output file '$output_file' already exists. Use --force to overwrite."
return 1
fi
fetch_keys_if_missing
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -R "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File encrypted successfully: $output_file"
if [[ "$DELETE_ORIGINAL" == true ]]; then
rm -f "$file"
log_message "Original file deleted: $file"
fi
else
rm -f "$temp_file"
log_message "Error: Failed to encrypt file '$file'."
return 1
fi
}
# Function to encrypt files or directories # Function to encrypt files or directories
encrypt_file_or_directory() encrypt_file_or_directory()
{ {
local file="$1" local file="$1"
if [[ -d "$file" ]]; then if [[ -d "$file" ]]; then
for f in "$file"/*; do # Enable dotglob to include hidden files
shopt -s dotglob nullglob
local files=("$file"/*)
shopt -u dotglob nullglob
if [[ ${#files[@]} -eq 0 ]]; then
log_message "Warning: Directory '$file' is empty."
return 0
fi
for f in "${files[@]}"; do
encrypt_file_or_directory "$f" encrypt_file_or_directory "$f"
done done
elif [[ -f "$file" ]]; then elif [[ -f "$file" ]]; then
encrypt_single_file "$file"
else
log_message "Warning: '$file' is not a file or directory, skipping."
fi
}
# Function to decrypt a single file
decrypt_single_file()
{
local file="$1"
if [[ ! "$file" == *.age ]]; then
log_message "Skipping non-.age file: $file"
return 0
fi
local output_file="${file%.age}"
# Check if output file exists
if [[ -f "$output_file" && "$FORCE" != true ]]; then
log_message "Error: Output file '$output_file' already exists. Use --force to overwrite."
return 1
fi
fetch_keys_if_missing fetch_keys_if_missing
local output_file="${file}.age"
local temp_file local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")" temp_file="$(mktemp -p "$(dirname "$file")")"
if age -R "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File encrypted successfully: $output_file" if age -d -i "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File decrypted successfully: $output_file"
if [[ "$DELETE_ORIGINAL" == true ]]; then
rm -f "$file"
log_message "Encrypted file deleted: $file"
fi
else else
rm -f "$temp_file" rm -f "$temp_file"
log_message "Error: Failed to encrypt file '$file'." log_message "Error: Failed to decrypt file '$file'."
exit 1 return 1
fi
fi fi
} }
@@ -142,54 +264,76 @@ encrypt_file_or_directory()
decrypt_file_or_directory() decrypt_file_or_directory()
{ {
local file="$1" local file="$1"
if [[ -d "$file" ]]; then if [[ -d "$file" ]]; then
for f in "$file"/*.age; do # Enable nullglob to handle no matches gracefully
decrypt_file_or_directory "$f" shopt -s nullglob
local files=("$file"/*.age)
shopt -u nullglob
if [[ ${#files[@]} -eq 0 ]]; then
log_message "Warning: No .age files found in directory '$file'."
return 0
fi
for f in "${files[@]}"; do
decrypt_single_file "$f"
done done
elif [[ -f "$file" ]]; then elif [[ -f "$file" ]]; then
fetch_keys_if_missing decrypt_single_file "$file"
local output_file="${file%.age}"
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -d -i "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File decrypted successfully: $output_file"
else else
rm -f "$temp_file" log_message "Warning: '$file' is not a file or directory, skipping."
log_message "Error: Failed to decrypt file '$file'."
exit 1
fi
fi fi
} }
# Main logic # Main entry point
case "$1" in main()
{
check_dependencies
# Parse flags and get remaining arguments
mapfile -t ARGS < <(parse_flags "$@")
prepare_log_file
local command="${ARGS[0]:-}"
local target="${ARGS[1]:-}"
case "$command" in
e | enc | encrypt) e | enc | encrypt)
if [[ $# -lt 2 ]]; then if [[ -z "$target" ]]; then
log_message "Error: No file or directory specified for encryption." log_message "Error: No file or directory specified for encryption."
print_help print_help
exit 1 exit 1
fi fi
encrypt_file_or_directory "$2" encrypt_file_or_directory "$target"
;; ;;
d | dec | decrypt) d | dec | decrypt)
if [[ $# -lt 2 ]]; then if [[ -z "$target" ]]; then
log_message "Error: No file or directory specified for decryption." log_message "Error: No file or directory specified for decryption."
print_help print_help
exit 1 exit 1
fi fi
decrypt_file_or_directory "$2" decrypt_file_or_directory "$target"
;; ;;
help | --help) help | --help | -h)
print_help print_help
;; ;;
version | --version) version | --version)
print_version print_version
;; ;;
"")
print_help
exit 1
;;
*) *)
log_message "Error: Unknown command '$1'" log_message "Error: Unknown command '$command'"
print_help print_help
exit 1 exit 1
;; ;;
esac esac
}
main "$@"
# vim: ft=bash:syn=sh:ts=2:sw=2:et:ai:nowrap # vim: ft=bash:syn=sh:ts=2:sw=2:et:ai:nowrap


@@ -2,28 +2,76 @@
Encrypt or decrypt files and directories using `age` and your GitHub SSH keys. Encrypt or decrypt files and directories using `age` and your GitHub SSH keys.
## Requirements
- [age](https://github.com/FiloSottile/age) - encryption tool
- curl - for fetching SSH keys
Install age:
```bash
brew install age # macOS
apt install age # Debian/Ubuntu
dnf install age # Fedora
```
## Usage ## Usage
```bash ```bash
a encrypt <file|dir> a [options] <command> <file|directory>
a decrypt <file.age|dir>
``` ```
Commands:
- `e`, `enc`, `encrypt` - encrypt files
- `d`, `dec`, `decrypt` - decrypt files
- `help`, `--help`, `-h` - show help
- `version`, `--version` - show version
Options: Options:
- `-v`, `--verbose` show log output - `-v`, `--verbose` - show log output
- `--delete` - delete original files after successful operation
- `-f`, `--force` - overwrite existing output files
Environment variables: Environment variables:
- `AGE_KEYSFILE` location of the keys file - `AGE_KEYSFILE` - location of the keys file (default: `~/.ssh/keys.txt`)
- `AGE_KEYSSOURCE` URL to fetch keys if missing - `AGE_KEYSSOURCE` - URL to fetch keys if missing (default: GitHub keys)
- `AGE_LOGFILE` log file path - `AGE_LOGFILE` - log file path (default: `~/.cache/a.log`)
## Example ## Examples
```bash ```bash
# Encrypt a file
a encrypt secret.txt a encrypt secret.txt
# Encrypt with short command
a e secret.txt
# Decrypt a file
a decrypt secret.txt.age a decrypt secret.txt.age
a d secret.txt.age
# Encrypt a directory (includes hidden files)
a e /path/to/secrets/
# Encrypt and delete originals
a --delete e secret.txt
# Force overwrite existing .age file
a -f e secret.txt
# Verbose output
a -v e secret.txt
``` ```
## Behavior
- Encrypting a directory processes all files recursively, including hidden files
- Already encrypted files (`.age`) are skipped during encryption
- Only `.age` files are processed during directory decryption
- Original files are preserved by default (use `--delete` to remove them)
- Output files are not overwritten by default (use `--force` to overwrite)
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : --> <!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->


@@ -12,6 +12,7 @@
: "${DOTFILES:=$HOME/.dotfiles}" : "${DOTFILES:=$HOME/.dotfiles}"
: "${BREWFILE:=$DOTFILES/config/homebrew/Brewfile}" : "${BREWFILE:=$DOTFILES/config/homebrew/Brewfile}"
: "${HOSTFILES:=$DOTFILES/hosts}" : "${HOSTFILES:=$DOTFILES/hosts}"
export DOTFILES BREWFILE HOSTFILES
SCRIPT=$(basename "$0") SCRIPT=$(basename "$0")
@@ -64,39 +65,64 @@ menu_builder()
done done
} }
# Handle install section commands
section_install() section_install()
{ {
USAGE_PREFIX="$SCRIPT install <command>" USAGE_PREFIX="$SCRIPT install <command>"
MENU=( MENU=(
"all:Installs everything in the correct order" "all:Installs everything in the correct order"
"apt-packages:Install apt packages (Debian/Ubuntu)"
"cargo:Install rust/cargo packages" "cargo:Install rust/cargo packages"
"cheat-databases:Install cheat external cheatsheet databases" "cheat-databases:Install cheat external cheatsheet databases"
"composer:Install composer" "composer:Install composer"
"dnf-packages:Install dnf packages (Fedora/RHEL)"
"fonts:Install programming fonts" "fonts:Install programming fonts"
"gh:Install GitHub CLI Extensions" "gh:Install GitHub CLI Extensions"
"git-crypt:Install git-crypt from source"
"go:Install Go Packages" "go:Install Go Packages"
"imagick:Install ImageMagick CLI" "imagick:Install ImageMagick CLI"
"macos:Setup nice macOS defaults" "macos:Setup nice macOS defaults"
"npm-packages:Install NPM Packages" "npm-packages:Install NPM Packages"
"ntfy:Install ntfy notification tool"
"nvm-latest:Install latest lts node using nvm" "nvm-latest:Install latest lts node using nvm"
"nvm:Install Node Version Manager (nvm)" "nvm:Install Node Version Manager (nvm)"
"python-packages:Install Python packages via uv"
"xcode-cli-tools:Install Xcode CLI tools (macOS)"
"z:Install z" "z:Install z"
) )
case "$1" in case "$1" in
all) all)
msgr msg "Starting to install all and reloading configurations..." msgr msg "Starting to install all and reloading configurations..."
$0 install macos
$0 install fonts # Tier 0: Platform foundations (OS packages, build tools)
[[ "$(uname)" == "Darwin" ]] && $0 install macos
[[ "$(uname)" == "Darwin" ]] && $0 install xcode-cli-tools
command -v apt &> /dev/null && $0 install apt-packages
command -v dnf &> /dev/null && $0 install dnf-packages
# Tier 1: Package managers & fonts
$0 brew install $0 brew install
$0 install fonts
# Tier 2: Language packages (depend on runtimes from Tier 1)
$0 install cargo $0 install cargo
$0 install go $0 install go
$0 install composer $0 install composer
$0 install cheat-databases
$0 install nvm $0 install nvm
$0 install npm-packages $0 install npm-packages
$0 install python-packages
# Tier 3: Tool-dependent installers
$0 install cheat-databases
$0 install gh
$0 install git-crypt
$0 install ntfy
# Tier 4: Independent utilities
$0 install z $0 install z
msgr msg "Reloading configurations again..." msgr msg "Reloading configurations again..."
# shellcheck disable=SC1091 # shellcheck disable=SC1091
source "$DOTFILES/config/shared.sh" source "$DOTFILES/config/shared.sh"
@@ -111,7 +137,7 @@ section_install()
cheat-databases) cheat-databases)
msgr run "Installing cheat databases..." msgr run "Installing cheat databases..."
for database in "$DOTFILES"/scripts/install-cheat-*; do for database in "$DOTFILES"/scripts/install-cheat-*.sh; do
bash "$database" \ bash "$database" \
&& msgr run_done "Cheat: $database run" && msgr run_done "Cheat: $database run"
done done
@@ -183,6 +209,42 @@ section_install()
&& msgr yay "NPM Packages have been installed!" && msgr yay "NPM Packages have been installed!"
;; ;;
apt-packages)
msgr run "Installing apt packages..."
bash "$DOTFILES/scripts/install-apt-packages.sh" \
&& msgr yay "apt packages installed!"
;;
dnf-packages)
msgr run "Installing dnf packages..."
bash "$DOTFILES/scripts/install-dnf-packages.sh" \
&& msgr yay "dnf packages installed!"
;;
git-crypt)
msgr run "Installing git-crypt..."
bash "$DOTFILES/scripts/install-git-crypt.sh" \
&& msgr yay "git-crypt installed!"
;;
ntfy)
msgr run "Installing ntfy..."
bash "$DOTFILES/scripts/install-ntfy.sh" \
&& msgr yay "ntfy installed!"
;;
python-packages)
msgr run "Installing Python packages..."
bash "$DOTFILES/scripts/install-python-packages.sh" \
&& msgr yay "Python packages installed!"
;;
xcode-cli-tools)
msgr run "Installing Xcode CLI tools..."
bash "$DOTFILES/scripts/install-xcode-cli-tools.sh" \
&& msgr yay "Xcode CLI tools installed!"
;;
z) z)
msgr run "Installing z..." msgr run "Installing z..."
bash "$DOTFILES/scripts/install-z.sh" \ bash "$DOTFILES/scripts/install-z.sh" \
@@ -193,6 +255,7 @@ section_install()
esac esac
} }
# Handle Homebrew section commands
section_brew() section_brew()
{ {
USAGE_PREFIX="$SCRIPT brew <command>" USAGE_PREFIX="$SCRIPT brew <command>"
@@ -291,6 +354,7 @@ section_brew()
esac esac
} }
# Handle helper utility commands
section_helpers() section_helpers()
{ {
USAGE_PREFIX="$SCRIPT helpers <command>" USAGE_PREFIX="$SCRIPT helpers <command>"
@@ -367,6 +431,7 @@ section_helpers()
esac esac
} }
# Handle apt package manager commands
section_apt() section_apt()
{ {
USAGE_PREFIX="$SCRIPT apt <command>" USAGE_PREFIX="$SCRIPT apt <command>"
@@ -435,6 +500,7 @@ section_apt()
esac esac
} }
# Handle documentation generation commands
section_docs() section_docs()
{ {
USAGE_PREFIX="$SCRIPT docs <command>" USAGE_PREFIX="$SCRIPT docs <command>"
@@ -459,6 +525,7 @@ section_docs()
esac esac
} }
# Handle dotfiles formatting and reset commands
section_dotfiles() section_dotfiles()
{ {
USAGE_PREFIX="$SCRIPT dotfiles <command>" USAGE_PREFIX="$SCRIPT dotfiles <command>"
@@ -526,6 +593,7 @@ section_dotfiles()
esac esac
} }
# Handle system check commands (arch, hostname)
section_check() section_check()
{ {
USAGE_PREFIX="$SCRIPT check <command>" USAGE_PREFIX="$SCRIPT check <command>"
@@ -552,6 +620,7 @@ section_check()
esac esac
} }
# Handle install script execution
section_scripts() section_scripts()
{ {
USAGE_PREFIX="$SCRIPT scripts <command>" USAGE_PREFIX="$SCRIPT scripts <command>"
@@ -619,6 +688,7 @@ section_tests()
esac esac
} }
# Display main usage information for all sections
usage() usage()
{ {
echo "" echo ""
@@ -642,6 +712,7 @@ usage()
section_helpers section_helpers
} }
# Parse section argument and dispatch to handler
main() main()
{ {
SECTION="${1:-}" SECTION="${1:-}"


@@ -22,32 +22,37 @@ if [ "$DEBUG" -eq 1 ]; then
set -x set -x
fi fi
# Output functions # Print an error message in red
msg_err() msg_err()
{ {
echo -e "\e[31m$*\e[0m" >&2 echo -e "\e[31m$*\e[0m" >&2
} }
# Print a success message in green
msg_success() msg_success()
{ {
echo -e "\e[32m$*\e[0m" echo -e "\e[32m$*\e[0m"
} }
# Print a warning message in yellow
msg_warn() msg_warn()
{ {
echo -e "\e[33m$*\e[0m" >&2 echo -e "\e[33m$*\e[0m" >&2
} }
# Print an info message in blue
msg_info() msg_info()
{ {
echo -e "\e[36m$*\e[0m" echo -e "\e[36m$*\e[0m"
} }
# Print a debug message when verbose mode is on
msg_debug() msg_debug()
{ {
[[ $VERBOSE -eq 1 ]] && echo -e "\e[35m$*\e[0m" [[ $VERBOSE -eq 1 ]] && echo -e "\e[35m$*\e[0m"
} }
# Display usage information and examples
show_help() show_help()
{ {
cat << EOF cat << EOF


@@ -90,13 +90,14 @@ declare -A DIR_HAS_REPOS
# Record start time # Record start time
START_TIME=$(date +%s) START_TIME=$(date +%s)
# Logging functions # Log an error message
log_error() log_error()
{ {
print_color "31" "ERROR:" >&2 print_color "31" "ERROR:" >&2
echo " $*" >&2 echo " $*" >&2
} }
# Log an informational message
log_info() log_info()
{ {
if [[ $VERBOSE -eq 1 ]]; then if [[ $VERBOSE -eq 1 ]]; then
@@ -105,6 +106,7 @@ log_info()
fi fi
} }
# Log a warning message
log_warn() log_warn()
{ {
print_color "33" "WARNING:" >&2 print_color "33" "WARNING:" >&2
@@ -911,6 +913,7 @@ process_in_parallel()
echo -e "\nProcessed $total repositories in $dur (Total runtime: $runtime)" echo -e "\nProcessed $total repositories in $dur (Total runtime: $runtime)"
} }
# Check a directory for git status with progress tracking
check_directory_with_progress() check_directory_with_progress()
{ {
local dir local dir


@@ -23,21 +23,25 @@ CLR_RESET="\033[0m"
# │ Color functions │ # │ Color functions │
# ╰──────────────────────────────────────────────────────────╯ # ╰──────────────────────────────────────────────────────────╯
# Wrap text in red color
function __color_red() function __color_red()
{ {
local MSG="$1" local MSG="$1"
echo -e "${CLR_RED}${MSG}${CLR_RESET}" echo -e "${CLR_RED}${MSG}${CLR_RESET}"
} }
# Wrap text in yellow color
function __color_yellow() function __color_yellow()
{ {
local MSG="$1" local MSG="$1"
echo -e "${CLR_YELLOW}${MSG}${CLR_RESET}" echo -e "${CLR_YELLOW}${MSG}${CLR_RESET}"
} }
# Wrap text in green color
function __color_green() function __color_green()
{ {
local MSG="$1" local MSG="$1"
echo -e "${CLR_GREEN}${MSG}${CLR_RESET}" echo -e "${CLR_GREEN}${MSG}${CLR_RESET}"
} }
# Wrap text in blue color
function __color_blue() function __color_blue()
{ {
local MSG="$1" local MSG="$1"
@@ -48,36 +52,43 @@ function __color_blue()
# │ Helpers │ # │ Helpers │
# ╰──────────────────────────────────────────────────────────╯ # ╰──────────────────────────────────────────────────────────╯
# Print blue arrow marker
function __log_marker() function __log_marker()
{ {
echo -e "${CLR_BLUE}➜${CLR_RESET}" echo -e "${CLR_BLUE}➜${CLR_RESET}"
} }
# Print green checkmark marker
function __log_marker_ok() function __log_marker_ok()
{ {
echo -e "${CLR_GREEN}✔${CLR_RESET}" echo -e "${CLR_GREEN}✔${CLR_RESET}"
} }
# Print blue checkmark marker
function __log_marker_ok_blue() function __log_marker_ok_blue()
{ {
echo -e "${CLR_BLUE}✔${CLR_RESET}" echo -e "${CLR_BLUE}✔${CLR_RESET}"
} }
# Print yellow warning marker
function __log_marker_warn() function __log_marker_warn()
{ {
echo -e "${CLR_YELLOW}⁕${CLR_RESET}" echo -e "${CLR_YELLOW}⁕${CLR_RESET}"
} }
# Print yellow question marker
function __log_marker_question() function __log_marker_question()
{ {
echo -e "${CLR_YELLOW}?${CLR_RESET}" echo -e "${CLR_YELLOW}?${CLR_RESET}"
} }
# Print red error marker
function __log_marker_err() function __log_marker_err()
{ {
echo -e "${CLR_RED}⛌${CLR_RESET}" echo -e "${CLR_RED}⛌${CLR_RESET}"
} }
# Print indentation spacing
function __log_indent() function __log_indent()
{ {
echo " " echo " "
@@ -87,71 +98,85 @@ function __log_indent()
# │ Log functions │ # │ Log functions │
# ╰──────────────────────────────────────────────────────────╯ # ╰──────────────────────────────────────────────────────────╯
# Print a message with arrow marker
function msg() function msg()
{ {
echo -e "$(__log_marker) $1" echo -e "$(__log_marker) $1"
} }
# Print a celebration message
function msg_yay() function msg_yay()
{ {
echo -e "🎉 $1" echo -e "🎉 $1"
} }
# Print a celebration message with checkmark
function msg_yay_done() function msg_yay_done()
{ {
echo -e "🎉 $1 ...$(__log_marker_ok)" echo -e "🎉 $1 ...$(__log_marker_ok)"
} }
# Print a message with completion checkmark
function msg_done() function msg_done()
{ {
echo -e "$(__log_marker) $1 ...$(__log_marker_ok)" echo -e "$(__log_marker) $1 ...$(__log_marker_ok)"
} }
# Print a completion checkmark suffix
function msg_done_suffix() function msg_done_suffix()
{ {
echo -e "$(__log_marker) ...$(__log_marker_ok)" echo -e "$(__log_marker) ...$(__log_marker_ok)"
} }
# Print a prompt-style message
function msg_prompt() function msg_prompt()
{ {
echo -e "$(__log_marker_question) $1" echo -e "$(__log_marker_question) $1"
} }
# Print a prompt message with checkmark
function msg_prompt_done() function msg_prompt_done()
{ {
echo -e "$(__log_marker_question) $1 ...$(__log_marker_ok)" echo -e "$(__log_marker_question) $1 ...$(__log_marker_ok)"
} }
# Print an indented message
function msg_nested() function msg_nested()
{ {
echo -e "$(__log_indent)$(__log_marker) $1" echo -e "$(__log_indent)$(__log_marker) $1"
} }
# Print an indented message with checkmark
function msg_nested_done() function msg_nested_done()
{ {
echo -e "$(__log_indent)$(__log_marker) $1 ...$(__log_marker_ok)" echo -e "$(__log_indent)$(__log_marker) $1 ...$(__log_marker_ok)"
} }
# Print a running-task message in green
function msg_run() function msg_run()
{ {
echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2" echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2"
} }
# Print a running-task message with checkmark
function msg_run_done() function msg_run_done()
{ {
echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2 ...$(__log_marker_ok)" echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2 ...$(__log_marker_ok)"
} }
# Print an ok/success message
function msg_ok() function msg_ok()
{ {
echo -e "$(__log_marker_ok) $1" echo -e "$(__log_marker_ok) $1"
} }
# Print a warning message
function msg_warn() function msg_warn()
{ {
echo -e "$(__log_marker_warn) $1" echo -e "$(__log_marker_warn) $1"
} }
# Print an error message
function msg_err() function msg_err()
{ {
echo -e "$(__log_marker_err) $1" echo -e "$(__log_marker_err) $1"
@@ -174,6 +199,7 @@ ask()
# If this is being sourced, no need to run the next steps. # If this is being sourced, no need to run the next steps.
[ "$sourced" = 1 ] && return [ "$sourced" = 1 ] && return
# Run visual tests for all message types
function __tests() function __tests()
{ {
msg "[ msg ]" msg "[ msg ]"
@@ -192,6 +218,7 @@ function __tests()
msg_yay_done "[ yay_done ]" msg_yay_done "[ yay_done ]"
} }
# Show usage information and examples
function usage() function usage()
{ {
echo "usage: msgr [type] [message] [optional second message]" echo "usage: msgr [type] [message] [optional second message]"


@@ -19,7 +19,7 @@ set -euo pipefail # Add error handling
LATEST_VERSION_FORMULA="php" # The formula name for latest PHP version
PHP_VERSION_FILE=".php-version" # File name to look for when auto-switching

# Switch brew php version

# Verify that Homebrew is installed
function check_dependencies()
{
  if ! command -v brew > /dev/null 2>&1; then
@@ -28,6 +28,7 @@ function check_dependencies()
  fi
}

# Display help message and usage examples
function usage()
{
  echo "Brew PHP Switcher - Switch between PHP versions installed via Homebrew"
@@ -53,6 +54,7 @@ function usage()
  exit 0
}

# List all PHP versions installed via Homebrew
function list_php_versions()
{
  # Check Homebrew's installation path for PHP versions
@@ -185,6 +187,7 @@ function list_php_versions()
  done
}

# Convert a version number to a Homebrew formula name
function get_php_formula_for_version()
{
  local version="$1"
@@ -199,6 +202,7 @@ function get_php_formula_for_version()
  echo "php@$version"
}

# Check if a Homebrew formula is installed
function check_formula_installed()
{
  local formula="$1"
@@ -216,6 +220,7 @@ function check_formula_installed()
  return 1
}

# Unlink the currently active PHP version
function unlink_current_php()
{
  local current_formula=""
@@ -241,6 +246,7 @@ function unlink_current_php()
  fi
}

# Link a specific PHP formula as the active version
function link_php_version()
{
  local formula="$1"
@@ -265,6 +271,7 @@ function link_php_version()
  fi
}

# Display the currently active PHP version
function get_current_version()
{
  if ! command -v php > /dev/null 2>&1; then
@@ -300,6 +307,7 @@ function get_current_version()
  fi
}

# Validate PHP version format (x.y or latest)
function validate_version()
{
  local version="$1"
@@ -312,6 +320,7 @@ function validate_version()
  fi
}

# Search for .php-version file in directory hierarchy
function find_php_version_file()
{
  local dir="$PWD"
@@ -334,6 +343,7 @@ function find_php_version_file()
  return 1
}

# Auto-switch PHP based on .php-version file
function auto_switch_php_version()
{
  local version_file
@@ -360,6 +370,7 @@ function auto_switch_php_version()
  switch_php_version "$version"
}

# Switch to a specific PHP version
function switch_php_version()
{
  local version="$1"
@@ -398,6 +409,7 @@ function switch_php_version()
  echo "PHP executable: $(command -v php)"
}

# Parse arguments and dispatch to appropriate action
function main()
{
  local version=""
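The `.php-version` auto-switch hinges on an upward directory search. A minimal standalone sketch of that pattern, runnable without Homebrew — the function name and layout here are illustrative, not the script's exact code:

```bash
#!/usr/bin/env bash
# Sketch: walk from a starting directory toward / and print the first
# .php-version file found, like find_php_version_file above.
set -euo pipefail

find_version_file() {
  local dir="$1"
  while [[ "$dir" != "/" ]]; do
    if [[ -f "$dir/.php-version" ]]; then
      printf '%s\n' "$dir/.php-version"
      return 0
    fi
    dir=$(dirname "$dir")
  done
  return 1
}

# Demo: a marker two levels above the working directory is found.
root=$(mktemp -d)
mkdir -p "$root/app/src"
echo "8.3" > "$root/.php-version"

file=$(find_version_file "$root/app/src")
cat "$file" # -> 8.3
```

Because the search stops at the first match, a project-local `.php-version` always wins over one in a parent directory.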


@@ -5,6 +5,7 @@
#
# Modified by Ismo Vuorinen <https://github.com/ivuorinen> 2023

# Display usage information for pushover
__pushover_usage()
{
  printf "pushover <options> <message>\n"
@@ -23,6 +24,7 @@ __pushover_usage()
  return 1
}

# Format an optional curl form field
__pushover_opt_field()
{
  field=$1
@@ -33,6 +35,7 @@ __pushover_opt_field()
  fi
}

# Send a pushover notification via curl
__pushover_send_message()
{
  device="${1:-}"


@@ -10,6 +10,7 @@ VERSION="1.0.0"
LANG_MAP="c:.c,.h|cpp:.cpp,.cc,.cxx,.hpp,.hxx|csharp:.cs|go:.go|java:.java|
javascript:.js,.jsx,.mjs,.ts,.tsx|python:.py|ruby:.rb|swift:.swift"

# Display usage information and options
usage()
{
  cat << EOF
@@ -24,22 +25,26 @@ EOF
  exit "${1:-0}"
}

# Log a timestamped message to stderr
log()
{
  printf '[%s] %s\n' "$(date '+%H:%M:%S')" "$*" >&2
}

# Log an error message and exit
err()
{
  log "ERROR: $*"
  exit 1
}

# Verify codeql binary is available in PATH
check_codeql()
{
  command -v codeql > /dev/null 2>&1 || err "codeql binary not found in PATH"
  log "Found codeql: $(codeql version --format=terse)"
}

# Get or create the CodeQL cache directory
get_cache_dir()
{
  cache="${XDG_CACHE_HOME:-$HOME/.cache}/codeql"
@@ -47,6 +52,7 @@ get_cache_dir()
  printf '%s' "$cache"
}

# Detect supported programming languages in source path
detect_languages()
{
  src_path="$1"
@@ -85,6 +91,7 @@ detect_languages()
  printf '%s' "$detected" | tr ' ' '\n' | sort -u | tr '\n' ' ' | sed 's/ $//'
}

# Create a CodeQL database for a language
create_database()
{
  lang="$1"
@@ -98,6 +105,7 @@ create_database()
  --overwrite
}

# Display analysis result statistics from SARIF file
show_results_stats()
{
  sarif_file="$1"
@@ -126,6 +134,7 @@ show_results_stats()
  return 0
}

# Run CodeQL analysis for a single language
analyze_language()
{
  lang="$1"
@@ -172,6 +181,7 @@ analyze_language()
  rm -rf "$db_path"
}

# Parse arguments and run CodeQL analysis pipeline
main()
{
  src_path="."
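The `LANG_MAP` string encodes `language:.ext1,.ext2` pairs separated by `|`. A simplified sketch of walking such a map to classify a single file — this helper is hypothetical, not the script's own `detect_languages`, and uses a reduced map:

```bash
#!/usr/bin/env bash
# Sketch: classify a filename by extension using a "lang:.exts|..." map.
set -euo pipefail

# Hypothetical reduced map in the same format as LANG_MAP above.
LANG_MAP="go:.go|python:.py|ruby:.rb"

detect_for_file() {
  local file="$1" entry lang exts ext
  IFS='|' read -ra entries <<< "$LANG_MAP"
  for entry in "${entries[@]}"; do
    lang=${entry%%:*} # part before the first colon
    exts=${entry#*:}  # comma-separated extension list
    IFS=',' read -ra ext_list <<< "$exts"
    for ext in "${ext_list[@]}"; do
      if [[ "$file" == *"$ext" ]]; then
        echo "$lang"
        return 0
      fi
    done
  done
  return 1
}

detected=$(detect_for_file "main.py")
echo "$detected" # -> python
```

The real script applies this kind of lookup across a whole source tree and deduplicates the detected languages.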


@@ -63,7 +63,7 @@ def test():
    except KeyError:
        pass
    else:
        raise AssertionError("invalid operator did not raise")

if __name__ == "__main__":


@@ -190,6 +190,7 @@ get_custom_group()
  return 1
}

# Check if a key matches the skipped keys list
is_skipped()
{
  local key=$1


@@ -1,5 +1,4 @@
#!/usr/bin/env python
# Python script to find the largest files in a git repository.
# The general method is based on the script in this blog post:
@@ -32,36 +31,36 @@
# vim:tw=120:ts=4:ft=python:norl:

import argparse
import glob
import signal
import sys
from subprocess import PIPE, Popen, check_output  # nosec B404

sortByOnDiskSize = False


class Blob:
    sha1 = ""
    size = 0
    packed_size = 0
    path = ""

    def __init__(self, line):
        cols = line.split()
        self.sha1, self.size, self.packed_size = cols[0], int(cols[2]), int(cols[3])

    def __repr__(self):
        return f"{self.sha1} - {self.size} - {self.packed_size} - {self.path}"

    def __lt__(self, other):
        if sortByOnDiskSize:
            return self.size < other.size
        else:
            return self.packed_size < other.packed_size

    def csv_line(self):
        return f"{self.size / 1024},{self.packed_size / 1024},{self.sha1},{self.path}"


def main():
@@ -74,9 +73,9 @@ def main():
    size_limit = 1024 * args.filesExceeding
    if args.filesExceeding > 0:
        print(f"Finding objects larger than {args.filesExceeding}kB…")
    else:
        print(f"Finding the {args.matchCount} largest objects…")

    blobs = get_top_blobs(args.matchCount, size_limit)
@@ -99,12 +98,29 @@ def get_top_blobs(count, size_limit):
    if sortByOnDiskSize:
        sort_column = 3

    git_dir = check_output(["git", "rev-parse", "--git-dir"]).decode("utf-8").strip()  # nosec B603 # nosemgrep
    idx_files = glob.glob(f"{git_dir}/objects/pack/pack-*.idx")
    verify_pack = Popen(  # nosec B603
        ["git", "verify-pack", "-v", *idx_files],
        stdout=PIPE,
        stderr=PIPE,
    )
    grep_blob = Popen(["grep", "blob"], stdin=verify_pack.stdout, stdout=PIPE, stderr=PIPE)  # nosec B603
    if verify_pack.stdout:
        verify_pack.stdout.close()
    sort_cmd = Popen(  # nosec B603
        ["sort", f"-k{sort_column}nr"],
        stdin=grep_blob.stdout,
        stdout=PIPE,
        stderr=PIPE,
    )
    if grep_blob.stdout:
        grep_blob.stdout.close()
    output = [line for line in sort_cmd.communicate()[0].decode("utf-8").strip().split("\n") if line]

    blobs = {}
    # use __lt__ to do the appropriate comparison
    compare_blob = Blob(f"a b {size_limit} {size_limit} c")
    for obj_line in output:
        blob = Blob(obj_line)
@@ -132,15 +148,18 @@ def populate_blob_paths(blobs):
    print("Finding object paths…")
    # Only include revs which have a path. Other revs aren't blobs.
    rev_list = Popen(["git", "rev-list", "--all", "--objects"], stdout=PIPE, stderr=PIPE)  # nosec B603
    awk_filter = Popen(["awk", "$2 {print}"], stdin=rev_list.stdout, stdout=PIPE, stderr=PIPE)  # nosec B603
    if rev_list.stdout:
        rev_list.stdout.close()
    all_object_lines = [line for line in awk_filter.communicate()[0].decode("utf-8").strip().split("\n") if line]

    outstanding_keys = list(blobs.keys())
    for line in all_object_lines:
        cols = line.split()
        sha1, path = cols[0], " ".join(cols[1:])
        if sha1 in outstanding_keys:
            outstanding_keys.remove(sha1)
            blobs[sha1].path = path
@@ -164,39 +183,50 @@ def print_out_blobs(blobs):
        stdout, _ = p.communicate(input_data)

        print("\nAll sizes in kB. The pack column is the compressed size of the object inside the pack file.\n")
        print(stdout.decode("utf-8").rstrip("\n"))
    else:
        print("No files found which match those criteria.")


def parse_arguments():
    parser = argparse.ArgumentParser(description="List the largest files in a git repository")
    parser.add_argument(
        "-c",
        "--match-count",
        dest="matchCount",
        type=int,
        default=10,
        help="Files to return. Default is 10. Ignored if --files-exceeding is used.",
    )
    parser.add_argument(
        "--files-exceeding",
        dest="filesExceeding",
        type=int,
        default=0,
        help=(
            "The cutoff amount, in KB. Files with a pack size"
            " (or physical size, with -p) larger than this will be printed."
        ),
    )
    parser.add_argument(
        "-p",
        "--physical-sort",
        dest="sortByOnDiskSize",
        action="store_true",
        help="Sort by the on-disk size. Default is to sort by the pack size.",
    )
    return parser.parse_args()


def signal_handler(_signal, _frame):
    print("Caught Ctrl-C. Exiting.")
    sys.exit(0)


# Default function is main()
if __name__ == "__main__":
    main()
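In the rewrite above, a single `shell=True` command string becomes a chain of `Popen` objects. The pipeline shape it reproduces is the classic producer → filter → sort. Here is the same shape in plain shell, with `printf` standing in for `git verify-pack` so the sketch runs without a repository:

```bash
#!/usr/bin/env bash
# Sketch: the producer | grep | sort pipeline that the Popen chain mirrors.
set -euo pipefail

# Hypothetical stand-in records: "<sha> <type> <size>".
# The real pipeline is: git verify-pack -v ...pack-*.idx | grep blob | sort -k3nr
result=$(printf 'b blob 10\na blob 30\nc tree 99\nd blob 20\n' \
  | grep blob \
  | sort -k3nr)
printf '%s\n' "$result"
```

`sort -k3nr` sorts numerically and descending on the third column, so the largest blob is printed first, which is exactly what the Python version relies on when it truncates to the top N.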


@@ -41,6 +41,7 @@ LOOP=0
SLEEP=1
TIMEOUT=5

# Display usage information and options
usage()
{
  echo "Usage: $0 [--loop|--forever] [--sleep=N] hostname1 hostname2 ..."


@@ -39,16 +39,19 @@ log_error()
{
  echo -e "${RED}ERROR:${NC} $1" >&2
}

# Log a warning message
log_warn()
{
  echo -e "${YELLOW}WARN:${NC} $1" >&2
}

# Log an informational message
log_info()
{
  if [[ "${INFO:-0}" == "1" ]]; then
    echo -e "${GREEN}INFO:${NC} $1" >&2
  fi
}

# Log a debug message
log_debug()
{
  if [[ "${DEBUG:-0}" == "1" ]]; then
626
local/bin/x-sonarcloud Executable file
View File

@@ -0,0 +1,626 @@
#!/usr/bin/env bash
# x-sonarcloud - Fetch SonarCloud issues for LLM analysis
# Copyright (c) 2025 - Licensed under MIT
#
# Usage:
# x-sonarcloud # Auto-detect, all open issues
# x-sonarcloud --pr <number> # PR-specific issues
# x-sonarcloud --branch <name> # Branch-specific issues
# x-sonarcloud --org <org> --project-key <key> # Explicit project
# x-sonarcloud --severities BLOCKER,CRITICAL # Filter by severity
# x-sonarcloud --types BUG,VULNERABILITY # Filter by type
# x-sonarcloud --statuses OPEN,CONFIRMED # Filter by status
# x-sonarcloud --resolved # Include resolved issues
# x-sonarcloud -h|--help # Show this help
#
# Examples:
# x-sonarcloud # All open issues in project
# x-sonarcloud --pr 42 # Issues on PR #42
# x-sonarcloud --branch main # Issues on main branch
# x-sonarcloud --severities BLOCKER --types BUG # Only blocker bugs
#
# Requirements:
# - curl and jq installed
# - SONAR_TOKEN environment variable set
# - sonar-project.properties or .sonarlint/connectedMode.json for auto-detection
set -euo pipefail
# Colors for output (stderr only)
readonly RED='\033[0;31m'
readonly GREEN='\033[0;32m'
readonly YELLOW='\033[1;33m'
readonly BLUE='\033[0;34m'
readonly NC='\033[0m' # No Color
# API constants
readonly MAX_PAGE_SIZE=500
readonly MAX_TOTAL_ISSUES=10000
# Show usage information
show_usage()
{
sed -n '3,27p' "$0" | sed 's/^# //' | sed 's/^#//'
}
# Log functions
log_error()
{
echo -e "${RED}ERROR:${NC} $1" >&2
}
# Log a warning message
log_warn()
{
echo -e "${YELLOW}WARN:${NC} $1" >&2
}
# Log an informational message
log_info()
{
if [[ "${INFO:-0}" == "1" ]]; then
echo -e "${GREEN}INFO:${NC} $1" >&2
fi
}
# Log a debug message
log_debug()
{
if [[ "${DEBUG:-0}" == "1" ]]; then
echo -e "${BLUE}DEBUG:${NC} $1" >&2
fi
}
# Check required dependencies
check_dependencies()
{
local missing=0
if ! command -v curl &> /dev/null; then
log_error "curl is not installed. Install it with your package manager."
missing=1
fi
if ! command -v jq &> /dev/null; then
log_error "jq is not installed. Install it with your package manager:"
log_error " https://jqlang.github.io/jq/download/"
missing=1
fi
if [[ "$missing" -eq 1 ]]; then
exit 1
fi
}
# Check authentication
check_auth()
{
if [[ -z "${SONAR_TOKEN:-}" ]]; then
log_error "SONAR_TOKEN environment variable is not set."
log_error "Generate a token at: https://sonarcloud.io/account/security"
log_error "Then export it: export SONAR_TOKEN=your_token_here"
exit 1
fi
}
# Detect project from sonar-project.properties
detect_project_from_properties()
{
local props_file="sonar-project.properties"
if [[ ! -f "$props_file" ]]; then
return 1
fi
local org key
org=$(grep -E '^sonar\.organization=' "$props_file" 2> /dev/null | cut -d'=' -f2- || echo "")
key=$(grep -E '^sonar\.projectKey=' "$props_file" 2> /dev/null | cut -d'=' -f2- || echo "")
if [[ -n "$org" && -n "$key" ]]; then
log_debug "Detected from sonar-project.properties: org=$org key=$key"
echo "$org" "$key" ""
return 0
fi
return 1
}
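The `grep`/`cut` extraction above is easy to exercise in isolation. A self-contained sketch against a throwaway properties file (the file content here is hypothetical demo data):

```bash
#!/usr/bin/env bash
# Sketch: the key=value extraction detect_project_from_properties performs.
set -euo pipefail

# Throwaway properties file for the demo.
props=$(mktemp)
cat > "$props" << 'EOF'
sonar.organization=my-org
sonar.projectKey=my-org_my-repo
EOF

# Same grep/cut pattern as the function above; -f2- keeps any '=' in the value.
org=$(grep -E '^sonar\.organization=' "$props" | cut -d'=' -f2-)
key=$(grep -E '^sonar\.projectKey=' "$props" | cut -d'=' -f2-)
echo "$org $key" # -> my-org my-org_my-repo
rm -f "$props"
```

Anchoring the patterns with `^sonar\.` avoids matching commented-out lines that merely contain the key name.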
# Detect project from .sonarlint/connectedMode.json
detect_project_from_sonarlint()
{
local sonarlint_file=".sonarlint/connectedMode.json"
if [[ ! -f "$sonarlint_file" ]]; then
return 1
fi
local org key region
org=$(jq -r '.sonarCloudOrganization // empty' "$sonarlint_file" 2> /dev/null || echo "")
key=$(jq -r '.projectKey // empty' "$sonarlint_file" 2> /dev/null || echo "")
region=$(jq -r '.region // empty' "$sonarlint_file" 2> /dev/null || echo "")
if [[ -n "$org" && -n "$key" ]]; then
log_debug "Detected from .sonarlint/connectedMode.json: org=$org key=$key region=$region"
echo "$org" "$key" "$region"
return 0
fi
return 1
}
# Orchestrate project detection in priority order
detect_project()
{
local result
# 1. sonar-project.properties
if result=$(detect_project_from_properties); then
echo "$result"
return 0
fi
# 2. .sonarlint/connectedMode.json
if result=$(detect_project_from_sonarlint); then
echo "$result"
return 0
fi
# No config found
log_error "Could not auto-detect SonarCloud project configuration."
log_error "Provide one of the following:"
log_error " 1. sonar-project.properties with sonar.organization and sonar.projectKey"
log_error " 2. .sonarlint/connectedMode.json with sonarCloudOrganization and projectKey"
log_error " 3. CLI flags: --org <org> --project-key <key>"
return 1
}
# Get API base URL (currently same for all regions)
get_base_url()
{
echo "https://sonarcloud.io"
}
# Make an authenticated SonarCloud API request
sonar_api_request()
{
local url="$1"
log_debug "API request: $url"
local http_code body response
response=$(curl -s -w "\n%{http_code}" \
-H "Authorization: Bearer $SONAR_TOKEN" \
"$url" 2> /dev/null) || {
log_error "curl request failed for: $url"
return 1
}
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')
case "$http_code" in
200)
echo "$body"
return 0
;;
401)
log_error "Authentication failed (HTTP 401). Check your SONAR_TOKEN."
return 1
;;
403)
log_error "Access forbidden (HTTP 403). Token may lack required permissions."
return 1
;;
404)
log_error "Not found (HTTP 404). Check organization and project key."
return 1
;;
429)
log_error "Rate limited (HTTP 429). Wait before retrying."
return 1
;;
*)
log_error "API request failed with HTTP $http_code"
log_debug "Response body: $body"
return 1
;;
esac
}
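The function above leans on a common curl trick: `-w "\n%{http_code}"` appends the status code as a final line after the body, and the caller peels it off again with `tail`/`sed`. The splitting step, simulated without a network call:

```bash
#!/usr/bin/env bash
# Sketch: splitting a curl -w "\n%{http_code}" response into body and status.
set -euo pipefail

# What `curl -s -w "\n%{http_code}" ...` would emit: body, then status line.
response=$'{"total": 2}\n200'

http_code=$(echo "$response" | tail -n1) # last line is the status code
body=$(echo "$response" | sed '$d')      # everything before it is the body

echo "code=$http_code body=$body" # -> code=200 body={"total": 2}
```

This keeps the request to a single curl invocation while still letting the caller branch on the HTTP status, as the `case` statement above does.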
# Fetch a single page of issues
fetch_issues_page()
{
local base_url="$1"
local project_key="$2"
local page="$3"
local pr_number="${4:-}"
local branch="${5:-}"
local severities="${6:-}"
local types="${7:-}"
local statuses="${8:-}"
local resolved="${9:-}"
local url="${base_url}/api/issues/search?componentKeys=${project_key}"
url="${url}&p=${page}&ps=${MAX_PAGE_SIZE}"
if [[ -n "$pr_number" ]]; then
url="${url}&pullRequest=${pr_number}"
fi
if [[ -n "$branch" ]]; then
url="${url}&branch=${branch}"
fi
if [[ -n "$severities" ]]; then
url="${url}&severities=${severities}"
fi
if [[ -n "$types" ]]; then
url="${url}&types=${types}"
fi
if [[ -n "$statuses" ]]; then
url="${url}&statuses=${statuses}"
fi
if [[ -n "$resolved" ]]; then
url="${url}&resolved=${resolved}"
fi
sonar_api_request "$url"
}
# Fetch all issues with pagination
fetch_all_issues()
{
local base_url="$1"
local project_key="$2"
local pr_number="${3:-}"
local branch="${4:-}"
local severities="${5:-}"
local types="${6:-}"
local statuses="${7:-}"
local resolved="${8:-}"
local page=1
local all_issues="[]"
local total=0
while true; do
log_info "Fetching issues page $page..."
local response
response=$(fetch_issues_page "$base_url" "$project_key" "$page" \
"$pr_number" "$branch" "$severities" "$types" "$statuses" "$resolved") || return 1
local page_issues page_total
page_issues=$(echo "$response" | jq '.issues // []' 2> /dev/null || echo "[]")
page_total=$(echo "$response" | jq '.total // 0' 2> /dev/null || echo "0")
local page_count
page_count=$(echo "$page_issues" | jq 'length' 2> /dev/null || echo "0")
log_debug "Page $page: $page_count issues (total available: $page_total)"
# Merge into accumulated results
all_issues=$(echo "$all_issues" "$page_issues" | jq -s '.[0] + .[1]' 2> /dev/null || echo "$all_issues")
total=$(echo "$all_issues" | jq 'length' 2> /dev/null || echo "0")
# Check if we have all issues or hit the cap
if [[ "$page_count" -lt "$MAX_PAGE_SIZE" ]]; then
break
fi
if [[ "$total" -ge "$MAX_TOTAL_ISSUES" ]]; then
log_warn "Reached maximum of $MAX_TOTAL_ISSUES issues. Results may be incomplete."
break
fi
page=$((page + 1))
done
log_info "Fetched $total issues total"
echo "$all_issues"
}
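The loop above terminates on either of two conditions: a short page (fewer than `MAX_PAGE_SIZE` items means the last page) or the `MAX_TOTAL_ISSUES` cap. The same control flow with a mock in place of the API, and smaller constants so it runs instantly:

```bash
#!/usr/bin/env bash
# Sketch: the pagination pattern of fetch_all_issues with a stubbed API.
set -euo pipefail

PAGE_SIZE=3  # stands in for MAX_PAGE_SIZE=500
MAX_TOTAL=10 # stands in for MAX_TOTAL_ISSUES=10000

# Hypothetical API stub: 7 items served PAGE_SIZE at a time.
mock_fetch_page() {
  local page=$1 start i
  start=$(((page - 1) * PAGE_SIZE + 1))
  for ((i = start; i < start + PAGE_SIZE && i <= 7; i++)); do
    echo "issue-$i"
  done
}

page=1
total=0
while true; do
  page_items=$(mock_fetch_page "$page")
  page_count=$(grep -c . <<< "$page_items" || true)
  total=$((total + page_count))
  # Stop on a short page (last page) or when the hard cap is reached.
  if ((page_count < PAGE_SIZE)); then break; fi
  if ((total >= MAX_TOTAL)); then break; fi
  page=$((page + 1))
done
echo "fetched $total issues in $page pages" # -> fetched 7 issues in 3 pages
```

The short-page check also handles an exact multiple gracefully: the next page simply comes back empty and the loop exits.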
# Format issues grouped by severity then by file
format_issues_by_severity()
{
local issues="$1"
local base_url="$2"
local org="$3"
local project_key="$4"
echo "$issues" | jq -r --arg base_url "$base_url" --arg org "$org" --arg key "$project_key" '
group_by(.severity) | sort_by(-(
if .[0].severity == "BLOCKER" then 5
elif .[0].severity == "CRITICAL" then 4
elif .[0].severity == "MAJOR" then 3
elif .[0].severity == "MINOR" then 2
elif .[0].severity == "INFO" then 1
else 0 end
)) | .[] |
"### Severity: \(.[0].severity)\n" +
(
group_by(.component) | .[] |
"#### File: \(.[0].component | split(":") | if length > 1 then .[1:] | join(":") else .[0] end)\n" +
(
[.[] |
"##### Issue: \(.message)\n" +
"- **Rule:** \(.rule)\n" +
"- **Type:** \(.type)\n" +
"- **Severity:** \(.severity)\n" +
"- **Status:** \(.status)\n" +
"- **Line:** \(.line // "N/A")\n" +
"- **Effort:** \(.effort // "N/A")\n" +
"- **Created:** \(.creationDate // "N/A")\n" +
"- **URL:** \($base_url)/project/issues?open=\(.key)&id=\($key)\n"
] | join("\n")
)
)
' 2> /dev/null || echo "Error formatting issues."
}
# Format summary counts
format_summary()
{
local issues="$1"
echo "### By Severity"
echo ""
echo "$issues" | jq -r '
group_by(.severity) | .[] |
"- **\(.[0].severity):** \(length)"
' 2> /dev/null || echo "- Error computing severity counts"
echo ""
echo "### By Type"
echo ""
echo "$issues" | jq -r '
group_by(.type) | .[] |
"- **\(.[0].type):** \(length)"
' 2> /dev/null || echo "- Error computing type counts"
echo ""
echo "### Total"
echo ""
local count
count=$(echo "$issues" | jq 'length' 2> /dev/null || echo "0")
echo "- **Total issues:** $count"
}
# Format the full markdown output
format_output()
{
local org="$1"
local project_key="$2"
local mode="$3"
local mode_value="$4"
local base_url="$5"
local issues="$6"
local issue_count
issue_count=$(echo "$issues" | jq 'length' 2> /dev/null || echo "0")
# Header and LLM instructions
cat << 'EOF'
# SonarCloud Issues Analysis Report
## LLM Processing Instructions
You are analyzing code quality issues from SonarCloud for this project.
**Your tasks:**
1. **Triage**: Review each issue and assess its real impact on the codebase
2. **Priority Assessment**: Rank issues by severity and likelihood of causing problems
3. **Code Verification**: Check the actual source code to confirm each issue is valid
4. **Root Cause Analysis**: Identify why the issue exists and what pattern caused it
5. **Implementation Plan**: Create actionable fix tasks grouped by file for efficiency
6. **False Positive Detection**: Flag issues that appear to be false positives with reasoning
**Tools to use:**
- `find`, `cat`, `rg` commands and available tools to examine current codebase
- `git log` and `git blame` to understand code history and authorship
- File system tools to verify mentioned files exist and check current state
EOF
# Project information
cat << EOF
## Project Information
- **Organization:** $org
- **Project Key:** $project_key
EOF
case "$mode" in
pr)
echo "- **Mode:** Pull Request #$mode_value"
echo "- **URL:** ${base_url}/project/issues?pullRequest=${mode_value}&id=${project_key}"
;;
branch)
echo "- **Mode:** Branch \`$mode_value\`"
echo "- **URL:** ${base_url}/project/issues?branch=${mode_value}&id=${project_key}"
;;
*)
echo "- **Mode:** Project (all open issues)"
echo "- **URL:** ${base_url}/project/issues?id=${project_key}"
;;
esac
echo "- **Dashboard:** ${base_url}/project/overview?id=${project_key}"
# Issues section
echo ""
echo "## Issues ($issue_count total)"
echo ""
if [[ "$issue_count" -eq 0 ]]; then
echo "No issues found matching the specified filters."
else
format_issues_by_severity "$issues" "$base_url" "$org" "$project_key"
echo ""
echo "## Summary"
echo ""
format_summary "$issues"
fi
# Footer
cat << 'EOF'
## Next Steps for LLM Analysis
1. **Validate against current code:**
- Check if mentioned files and lines still match the reported issues
- Verify issues are not already fixed in the current branch
- Identify false positives and explain why they are false positives
2. **Prioritize fixes:**
- Address BLOCKER and CRITICAL severity issues first
- Group fixes by file to minimize context switching
- Consider effort estimates when planning the fix order
3. **Group by file for implementation:**
- Batch changes to the same file together
- Consider dependencies between fixes
- Plan atomic commits per logical change group
4. **Track progress:**
- Use todo lists and memory tools to track which issues are addressed
- Mark false positives with clear reasoning
- Verify fixes do not introduce new issues
EOF
}
# Main pipeline: fetch and display issues
fetch_and_display_issues()
{
local org="$1"
local project_key="$2"
local mode="$3"
local mode_value="$4"
local severities="${5:-}"
local types="${6:-}"
local statuses="${7:-}"
local resolved="${8:-}"
local base_url
base_url=$(get_base_url)
local pr_number=""
local branch=""
case "$mode" in
pr)
pr_number="$mode_value"
;;
branch)
branch="$mode_value"
;;
esac
log_info "Fetching SonarCloud issues for $project_key (mode: $mode)..."
local issues
issues=$(fetch_all_issues "$base_url" "$project_key" \
"$pr_number" "$branch" "$severities" "$types" "$statuses" "$resolved") || {
log_error "Failed to fetch issues"
return 1
}
format_output "$org" "$project_key" "$mode" "$mode_value" "$base_url" "$issues"
}
# Main function
main()
{
local org=""
local project_key=""
local mode="project"
local mode_value=""
local severities=""
local types=""
local statuses="OPEN,CONFIRMED,REOPENED"
local resolved="false"
# Parse arguments
while [[ $# -gt 0 ]]; do
case "$1" in
-h | --help)
show_usage
exit 0
;;
--pr)
mode="pr"
mode_value="${2:?Missing PR number after --pr}"
shift 2
;;
--branch)
mode="branch"
mode_value="${2:?Missing branch name after --branch}"
shift 2
;;
--org)
org="${2:?Missing organization after --org}"
shift 2
;;
--project-key)
project_key="${2:?Missing project key after --project-key}"
shift 2
;;
--severities)
severities="${2:?Missing severities after --severities}"
shift 2
;;
--types)
types="${2:?Missing types after --types}"
shift 2
;;
--statuses)
statuses="${2:?Missing statuses after --statuses}"
shift 2
;;
--resolved)
resolved="true"
statuses=""
shift
;;
*)
log_error "Unknown argument: $1"
show_usage
exit 1
;;
esac
done
check_dependencies
check_auth
# Auto-detect project if not specified via CLI
if [[ -z "$org" || -z "$project_key" ]]; then
local detected
detected=$(detect_project) || exit 1
# shellcheck disable=SC2034 # region reserved for future per-region base URLs
read -r detected_org detected_key detected_region <<< "$detected"
if [[ -z "$org" ]]; then
org="$detected_org"
fi
if [[ -z "$project_key" ]]; then
project_key="$detected_key"
fi
fi
log_debug "Organization: $org"
log_debug "Project Key: $project_key"
log_debug "Mode: $mode"
log_debug "Severities: ${severities:-all}"
log_debug "Types: ${types:-all}"
log_debug "Statuses: ${statuses:-all}"
log_debug "Resolved: $resolved"
fetch_and_display_issues "$org" "$project_key" "$mode" "$mode_value" \
"$severities" "$types" "$statuses" "$resolved"
}
# Run main function with all arguments
main "$@"

local/bin/x-sonarcloud.md (new file, 46 lines)

@@ -0,0 +1,46 @@
# x-sonarcloud
---
## Usage
```bash
x-sonarcloud # Auto-detect, all open issues
x-sonarcloud --pr <number> # PR-specific issues
x-sonarcloud --branch <name> # Branch-specific issues
x-sonarcloud --org <org> --project-key <key> # Explicit project
x-sonarcloud --severities BLOCKER,CRITICAL # Filter by severity
x-sonarcloud --types BUG,VULNERABILITY # Filter by type
x-sonarcloud --statuses OPEN,CONFIRMED # Filter by status
x-sonarcloud --resolved # Include resolved issues
x-sonarcloud -h|--help # Show help
```
Fetches SonarCloud code quality issues via REST API and formats them as
structured markdown with LLM processing instructions for automated analysis
and triage.
## Examples
```bash
x-sonarcloud # All open issues in project
x-sonarcloud --pr 42 # Issues on PR #42
x-sonarcloud --branch main # Issues on main branch
x-sonarcloud --severities BLOCKER --types BUG # Only blocker bugs
```
## Requirements
- `curl` and `jq` installed
- `SONAR_TOKEN` environment variable set
(generate at <https://sonarcloud.io/account/security>)
- Project auto-detection via `sonar-project.properties` or
`.sonarlint/connectedMode.json`, or explicit `--org`/`--project-key` flags
## Environment Variables
- `SONAR_TOKEN` — Bearer token for SonarCloud API authentication (required)
- `INFO=1` — Enable informational log messages on stderr
- `DEBUG=1` — Enable debug log messages on stderr
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->


@@ -154,6 +154,7 @@ get_state()
# ERROR HANDLING AND CLEANUP
# ============================================================================

# Clean up temporary files and handle exit
cleanup()
{
  exit_code=$?
@@ -177,6 +178,7 @@ trap cleanup EXIT INT TERM
# LOGGING FUNCTIONS
# ============================================================================

# Create audit directories and initialize log file
setup_logging()
{
  # Create all necessary directories
@@ -197,6 +199,7 @@ setup_logging()
  } >> "$LOG_FILE"
}

# Log a message with timestamp and severity level
log_message()
{
  level="$1"
@@ -225,6 +228,7 @@ log_message()
# INPUT VALIDATION
# ============================================================================

# Validate hostname format for SSH connection
validate_hostname()
{
  hostname="$1"
@@ -244,6 +248,7 @@ validate_hostname()
  return 0
}

# Validate username format for SSH connection
validate_username()
{
  username="$1"
@@ -263,6 +268,7 @@ validate_username()
  return 0
}

# Parse input file into validated host entries
parse_host_list()
{
  input_file="$1"
@@ -309,6 +315,7 @@ parse_host_list()
# SSH CONNECTION FUNCTIONS
# ============================================================================

# Execute SSH command with retry logic and key fallback
ssh_with_retry()
{
  host="$1"
@@ -373,6 +380,7 @@ ssh_with_retry()
  return 1
}

# Verify SSH connectivity to a host
test_ssh_connectivity()
{
  host="$1"
@@ -392,6 +400,7 @@ test_ssh_connectivity()
# SSH SECURITY AUDIT FUNCTIONS
# ============================================================================

# Audit SSH daemon configuration on a remote host
check_sshd_config()
{
  host="$1"
@@ -451,6 +460,7 @@ check_sshd_config()
# AUTOMATED UPDATES DETECTION
# ============================================================================

# Check if automated security updates are enabled
check_automated_updates()
{
  host="$1"
@@ -532,6 +542,7 @@ check_automated_updates()
# PENDING REBOOT DETECTION
# ============================================================================

# Detect if a remote host requires a reboot
check_pending_reboot()
{
  host="$1"
@@ -602,6 +613,7 @@ check_pending_reboot()
# REMEDIATION FUNCTIONS
# ============================================================================

# Create a timestamped backup of sshd_config
backup_sshd_config()
{
  host="$1"
@@ -616,6 +628,7 @@ backup_sshd_config()
  " "$ssh_key"
}

# Disable password authentication on a remote host
disable_password_auth()
{
  host="$1"
@@ -668,6 +681,7 @@ ClientAliveCountMax 2
# REPORTING FUNCTIONS
# ============================================================================

# Generate CSV report from audit results
generate_csv_report()
{
  report_file="$1"
@@ -693,6 +707,7 @@ generate_csv_report()
  done < "$HOSTS_LIST_FILE"
}

# Display formatted audit summary to terminal
display_summary()
{
  printf '\n'
@@ -743,6 +758,7 @@ display_summary()
# MAIN AUDIT FUNCTION
# ============================================================================

# Run all audit checks on a single host
audit_host()
{
  host_entry="$1"
@@ -788,6 +804,7 @@ audit_host()
# MAIN EXECUTION
# ============================================================================

# Main entry point: parse args, run audits, generate report
main()
{
  input_file="${1:-}"

@@ -9,11 +9,13 @@
 # <r> <g> <b> range from 0 to 255 inclusive.
 # The escape sequence ^[0m returns output to default
+# Set terminal background to an RGB color
 setBackgroundColor()
 {
   echo -en "\x1b[48;2;$1;$2;$3""m"
 }
+# Reset terminal output formatting
 resetOutput()
 {
   echo -en "\x1b[0m\n"
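The two helpers above can be exercised on their own; this is a minimal sketch using `printf` instead of `echo -en` (equivalent output), assuming a terminal with 24-bit color support, with illustrative RGB values:

```shell
# Truecolor escape pattern: set an RGB background, print a cell, reset.
setBackgroundColor() { printf '\x1b[48;2;%d;%d;%dm' "$1" "$2" "$3"; }
resetOutput() { printf '\x1b[0m\n'; }

# Paint a short red bar, then return the terminal to defaults:
setBackgroundColor 200 30 30
printf '    '
resetOutput
```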


@@ -28,6 +28,7 @@
 set -euo pipefail
+# Display usage information and options
 usage()
 {
   cat << EOF
@@ -52,6 +53,7 @@ THUMB_SUFFIX="${THUMB_SUFFIX:-_thumb}"
 # List of MIME types supported by ImageMagick (adjust as needed)
 ALLOWED_MIMETYPES=("image/jpeg" "image/png" "image/gif" "image/bmp" "image/tiff" "image/webp")
+# Verify ImageMagick is available
 check_magick_installed()
 {
   if ! command -v magick &> /dev/null; then
@@ -60,6 +62,7 @@ check_magick_installed()
   fi
 }
+# Verify mimetype command is available
 check_mimetype_installed()
 {
   if ! command -v mimetype &> /dev/null; then
@@ -165,6 +168,7 @@ generate_thumbnails()
   done < <(find "$source_dir" -type f -print0)
 }
+# Parse options, validate inputs, and generate thumbnails
 main()
 {
   parse_options "$@"


@@ -26,6 +26,7 @@ if [ "$#" -lt 2 ]; then
   exit 1
 fi
+# Wait until host stops responding to ping
 wait_for_host_down()
 {
   local host=$1
@@ -37,6 +38,7 @@ wait_for_host_down()
   done
 }
+# Wait for host to go down then execute command
 main()
 {
   local host=$1


@@ -30,6 +30,7 @@ if [ "$#" -lt 2 ]; then
   exit 1
 fi
+# Extract hostname from arguments, handling ssh shortcut
 get_host()
 {
   if [ "$1" = "ssh" ]; then
@@ -39,6 +40,7 @@ get_host()
   fi
 }
+# Wait until host responds to ping
 wait_for_host()
 {
   local host=$1
@@ -50,6 +52,7 @@ wait_for_host()
   done
 }
+# Wait for host to come online then execute command
 main()
 {
   local host


@@ -9,10 +9,15 @@
     "lint:biome": "biome check .",
     "fix:biome": "biome check --write .",
     "format": "biome format --write .",
+    "lint:prettier": "prettier --check '**/*.{yml,yaml}'",
+    "fix:prettier": "prettier --write '**/*.{yml,yaml}'",
+    "format:yaml": "prettier --write '**/*.{yml,yaml}'",
     "test": "bash test-all.sh",
     "lint:ec": "ec -f gcc",
-    "lint": "yarn lint:biome && yarn lint:ec",
-    "fix": "yarn fix:biome"
+    "lint:md-table": "git ls-files '*.md' | xargs markdown-table-formatter --check",
+    "fix:md-table": "git ls-files '*.md' | xargs markdown-table-formatter",
+    "lint": "yarn lint:biome && yarn lint:prettier && yarn lint:ec && yarn lint:md-table",
+    "fix": "yarn fix:biome && yarn fix:prettier && yarn fix:md-table"
   },
   "repository": {
     "type": "git",
@@ -33,6 +38,8 @@
     "@types/node": "^24.0.1",
     "bats": "^1.12.0",
     "editorconfig-checker": "^6.1.0",
+    "markdown-table-formatter": "^1.7.0",
+    "prettier": "^3.8.1",
     "typescript": "^5.8.3"
   },
   "packageManager": "yarn@4.12.0"

pyproject.toml Normal file

@@ -0,0 +1,9 @@
+[tool.ruff]
+target-version = "py39"
+line-length = 120
+
+[tool.ruff.lint]
+select = ["E", "F", "W", "I", "UP", "B", "SIM", "C4"]
+
+[tool.ruff.format]
+quote-style = "double"


@@ -7,6 +7,7 @@ set -euo pipefail
 source "${DOTFILES}/config/shared.sh"
 DEST="$HOME/.dotfiles/docs/nvim-keybindings.md"
+# Generate Neovim keybindings documentation
 main()
 {
   msg "Generating Neovim keybindings documentation"
@@ -28,6 +29,7 @@ main()
     && mv "${DEST}.tmp" "$DEST"
   msg "Neovim keybindings documentation generated at $DEST"
+  return 0
 }
 main "$@"


@@ -6,20 +6,30 @@
 source "${DOTFILES}/config/shared.sh"
 DEST="$HOME/.dotfiles/docs/wezterm-keybindings.md"
+# Generate wezterm keybindings documentation
 main()
 {
   msg "Generating wezterm keybindings documentation"
+  local tmp
+  tmp="$(mktemp)"
+  trap 'rm -f "$tmp"' RETURN
   {
     printf "# wezterm keybindings\n\n"
     printf "\`\`\`txt\n"
-  } > "$DEST"
-  wezterm show-keys >> "$DEST"
+  } > "$tmp"
+  if ! wezterm show-keys >> "$tmp"; then
+    msg "Failed to run 'wezterm show-keys'"
+    return 1
+  fi
-  printf "\`\`\`\n\n- Generated on %s\n" "$(date)" >> "$DEST"
+  printf "\`\`\`\n\n- Generated on %s\n" "$(date)" >> "$tmp"
+  mv "$tmp" "$DEST"
   msg "wezterm keybindings documentation generated at $DEST"
+  return 0
 }
 main "$@"
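The temp-file pattern introduced in this hunk (build into `mktemp` output, install atomically with `mv`, clean up via a `RETURN` trap on early exit) can be sketched in isolation; `generate_doc` and the `printf` writers below are hypothetical stand-ins for the real script's commands:

```shell
# Write into a temp file; only a successful run replaces the destination.
generate_doc() {
  local dest=$1 tmp
  tmp="$(mktemp)"
  trap 'rm -f "$tmp"' RETURN
  printf '# header\n' > "$tmp"
  if ! printf 'body\n' >> "$tmp"; then
    return 1 # tmp is removed by the RETURN trap; dest is left untouched
  fi
  mv "$tmp" "$dest" # atomic install; trap's rm -f then finds nothing
}
```

The design point: a failing generator never leaves a half-written destination file behind, which matters under `set -euo pipefail`.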

scripts/install-apt-packages.sh Executable file

@@ -0,0 +1,79 @@
+#!/usr/bin/env bash
+set -euo pipefail
+# @description Install essential apt packages for development.
+#
+# shellcheck source=shared.sh
+source "$DOTFILES/config/shared.sh"
+
+msgr run "Starting to install apt packages"
+
+if ! command -v apt &> /dev/null; then
+  msgr warn "apt not found (not a Debian-based system)"
+  exit 0
+fi
+
+packages=(
+  # Build essentials
+  build-essential # gcc, g++, make
+  cmake # Cross-platform build system
+  pkg-config # Helper for compiling against libraries
+  autoconf # Automatic configure script builder
+  automake # Makefile generator
+  libtool # Generic library support script
+
+  # Libraries for compiling languages
+  libssl-dev # SSL development headers
+  libffi-dev # Foreign function interface
+  zlib1g-dev # Compression library
+  libreadline-dev # Command-line editing
+  libbz2-dev # Bzip2 compression
+  libsqlite3-dev # SQLite database
+  libncurses-dev # Terminal UI library
+
+  # CLI utilities (not in cargo/go/npm)
+  jq # JSON processor
+  tmux # Terminal multiplexer
+  tree # Directory listing
+  unzip # Archive extraction
+  shellcheck # Shell script linter
+  socat # Multipurpose network relay
+  gnupg # GPG encryption/signing
+  software-properties-common # add-apt-repository command
+)
+
+# Install apt packages that are not already present
+install_packages()
+{
+  local to_install=()
+  for pkg in "${packages[@]}"; do
+    pkg="${pkg%%#*}"
+    pkg="${pkg// /}"
+    [[ -z "$pkg" ]] && continue
+    if dpkg -s "$pkg" &> /dev/null; then
+      msgr ok "$pkg already installed"
+    else
+      to_install+=("$pkg")
+    fi
+  done
+
+  if [[ ${#to_install[@]} -gt 0 ]]; then
+    msgr run "Installing ${#to_install[@]} packages: ${to_install[*]}"
+    sudo apt update
+    sudo apt install -y "${to_install[@]}"
+  else
+    msgr ok "All packages already installed"
+  fi
+  return 0
+}
+
+# Install all apt packages and report completion
+main()
+{
+  install_packages
+  msgr yay "apt package installations complete"
+  return 0
+}
+
+main "$@"
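The per-entry cleanup inside `install_packages` above (strip a trailing `# comment`, strip spaces, skip empties) can be factored into a tiny helper for illustration; `strip_pkg` is a hypothetical name, not part of the script:

```shell
# Normalize one package-array entry before passing it to the package manager.
strip_pkg() {
  local pkg=$1
  pkg="${pkg%%#*}" # drop everything from the first '#' (inline comment)
  pkg="${pkg// /}" # drop spaces left over from column alignment
  printf '%s' "$pkg"
}
```

Note the second substitution removes *all* spaces, which is safe here only because package names never contain spaces.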


@@ -57,6 +57,7 @@ install_packages()
     msgr run_done "Done installing $pkg"
     echo ""
   done
+  return 0
 }
 # Function to perform additional steps for installed cargo packages
@@ -72,13 +73,16 @@ post_install_steps()
   msgr run "Removing cargo cache"
   cargo cache --autoclean
   msgr "done" "Done removing cargo cache"
+  return 0
 }
+# Install cargo packages and run post-install steps
 main()
 {
   install_packages
   msgr "done" "Installed cargo packages!"
   post_install_steps
+  return 0
 }
 main "$@"


@@ -12,6 +12,7 @@ PBB_SYNTAX="syntax: bash"
 PBB_TAGS="tags: [bash]"
 PBB_TEMP_DIR="${XDG_CACHE_HOME:-$HOME/.cache}/cheat/pbb"
+# Verify required tools are installed
 check_required_tools()
 {
   for t in "${PBB_REQUIRED_TOOLS[@]}"; do
@@ -20,32 +21,37 @@ check_required_tools()
       exit 1
     fi
   done
+  return 0
 }
+# Clone or update the pure-bash-bible repository
 clone_or_update_repo()
 {
-  if [ ! -d "$PBB_TEMP_DIR/.git" ]; then
+  if [[ ! -d "$PBB_TEMP_DIR/.git" ]]; then
     msg_run "Starting to clone $PBB_GIT"
-    git clone --depth 1 --single-branch -q "$PBB_GIT" "$PBB_TEMP_DIR" \
-      && msg_yay "Cloned $PBB_GIT"
+    git clone --depth 1 --single-branch -q "$PBB_GIT" "$PBB_TEMP_DIR"
+    msg_yay "Cloned $PBB_GIT"
   else
     msg_run "Starting to update $PBB_GIT"
     git -C "$PBB_TEMP_DIR" reset --hard origin/master
-    git -C "$PBB_TEMP_DIR" pull -q \
-      && msgr yay "Updated $PBB_GIT"
+    git -C "$PBB_TEMP_DIR" pull -q
+    msg_yay "Updated $PBB_GIT"
   fi
+  return 0
 }
+# Get the cheat destination directory for pure-bash-bible
 prepare_cheat_dest()
 {
   local cheat_dest
   cheat_dest="$(cheat -d | grep pure-bash-bible | head -1 | awk '{print $2}')"
-  if [ ! -d "$cheat_dest" ]; then
+  if [[ ! -d "$cheat_dest" ]]; then
     mkdir -p "$cheat_dest"
   fi
   echo "$cheat_dest"
+  return 0
 }
 # Processes chapter files from the pure-bash-bible repository and generates or updates corresponding cheat sheets.
@@ -83,19 +89,22 @@ process_chapters()
     LC_ALL=C perl -pi.bak -e 's/\<\!-- CHAPTER END --\>//' "$cheat_file"
     rm "$cheat_file.bak"
-    if [ '---' != "$(head -1 < "$cheat_file")" ]; then
+    if [[ '---' != "$(head -1 < "$cheat_file")" ]]; then
       local metadata
       metadata="$PBB_SYNTAX\n$PBB_TAGS\n$PBB_SOURCE\n"
       printf '%s\n%b%s\n%s' "---" "$metadata" "---" "$(cat "$cheat_file")" > "$cheat_file"
     fi
   done
+  return 0
 }
+# Install pure-bash-bible cheatsheets
 main()
 {
   check_required_tools
   clone_or_update_repo
   process_chapters
+  return 0
 }
 main "$@"
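The front-matter guard in `process_chapters` above (prepend YAML metadata only when the file does not already start with a `---` line, making the step idempotent) can be sketched as a standalone helper; `add_front_matter` is a hypothetical name:

```shell
# Prepend '---'-delimited metadata unless the file already has front matter.
add_front_matter() {
  local file=$1 meta=$2
  if [[ '---' != "$(head -1 < "$file")" ]]; then
    # Expansions run before the > redirection truncates the file,
    # so $(cat "$file") still sees the old contents.
    printf '%s\n%b%s\n%s' '---' "$meta" '---' "$(cat "$file")" > "$file"
  fi
}
```

Running it twice leaves the file unchanged the second time, which is what lets the cheat-sheet sync be re-run safely.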


@@ -14,7 +14,7 @@ EXPECTED_CHECKSUM="$(php -r 'copy("https://composer.github.io/installer.sig", "p
 php -r "copy('https://getcomposer.org/installer', 'composer-setup.php');"
 ACTUAL_CHECKSUM="$(php -r "echo hash_file('sha384', 'composer-setup.php');")"
-if [ "$EXPECTED_CHECKSUM" != "$ACTUAL_CHECKSUM" ]; then
+if [[ "$EXPECTED_CHECKSUM" != "$ACTUAL_CHECKSUM" ]]; then
   echo >&2 'ERROR: Invalid installer checksum'
   rm composer-setup.php
   exit 1
@@ -23,7 +23,7 @@ fi
 php composer-setup.php --quiet
 RESULT=$?
 rm composer-setup.php
-if [ $RESULT -eq 0 ]; then
+if [[ $RESULT -eq 0 ]]; then
   mv composer.phar ~/.local/bin/composer
 fi
 exit $RESULT

scripts/install-dnf-packages.sh Executable file

@@ -0,0 +1,89 @@
+#!/usr/bin/env bash
+set -euo pipefail
+# @description Install essential dnf packages for development.
+#
+# shellcheck source=shared.sh
+source "$DOTFILES/config/shared.sh"
+
+msgr run "Starting to install dnf packages"
+
+if ! command -v dnf &> /dev/null; then
+  msgr warn "dnf not found (not a Fedora/RHEL-based system)"
+  exit 0
+fi
+
+packages=(
+  # Build essentials (individual packages, group handled separately)
+  cmake # Cross-platform build system
+  pkgconfig # Helper for compiling against libraries
+  autoconf # Automatic configure script builder
+  automake # Makefile generator
+  libtool # Generic library support script
+
+  # Libraries for compiling languages
+  openssl-devel # SSL development headers
+  libffi-devel # Foreign function interface
+  zlib-devel # Compression library
+  readline-devel # Command-line editing
+  bzip2-devel # Bzip2 compression
+  sqlite-devel # SQLite database
+  ncurses-devel # Terminal UI library
+
+  # CLI utilities (not in cargo/go/npm)
+  jq # JSON processor
+  tmux # Terminal multiplexer
+  tree # Directory listing
+  unzip # Archive extraction
+  ShellCheck # Shell script linter
+  socat # Multipurpose network relay
+  gnupg2 # GPG encryption/signing
+)
+
+# Install the Development Tools dnf group
+install_dev_tools_group()
+{
+  if dnf group list installed 2> /dev/null | grep -q "Development Tools"; then
+    msgr ok "@development-tools group already installed"
+  else
+    msgr run "Installing @development-tools group"
+    sudo dnf group install -y "Development Tools"
+  fi
+  return 0
+}
+
+# Install dnf packages that are not already present
+install_packages()
+{
+  local to_install=()
+  for pkg in "${packages[@]}"; do
+    pkg="${pkg%%#*}"
+    pkg="${pkg// /}"
+    [[ -z "$pkg" ]] && continue
+    if rpm -q "$pkg" &> /dev/null; then
+      msgr ok "$pkg already installed"
+    else
+      to_install+=("$pkg")
+    fi
+  done
+
+  if [[ ${#to_install[@]} -gt 0 ]]; then
+    msgr run "Installing ${#to_install[@]} packages: ${to_install[*]}"
+    sudo dnf install -y "${to_install[@]}"
+  else
+    msgr ok "All packages already installed"
+  fi
+  return 0
+}
+
+# Install all dnf packages and report completion
+main()
+{
+  install_dev_tools_group
+  install_packages
+  msgr yay "dnf package installations complete"
+  return 0
+}
+
+main "$@"


@@ -18,11 +18,16 @@ fonts=(
 # Function to clone or update the NerdFonts repository
 clone_or_update_repo()
 {
-  if [ ! -d "$TMP_PATH" ]; then
+  if [[ ! -d "$TMP_PATH/.git" ]]; then
+    rm -rf "$TMP_PATH"
     git clone --quiet --filter=blob:none --sparse --depth=1 "$GIT_REPO" "$TMP_PATH"
   fi
-  cd "$TMP_PATH" || { msgr err "No such folder $TMP_PATH"; exit 1; }
+  cd "$TMP_PATH" || {
+    msgr err "No such folder $TMP_PATH"
+    exit 1
+  }
+  return 0
 }
 # Function to add fonts to sparse-checkout
@@ -38,6 +43,7 @@ add_fonts_to_sparse_checkout()
     git sparse-checkout add "patched-fonts/$font"
     echo ""
   done
+  return 0
 }
 # Function to install NerdFonts
@@ -47,19 +53,24 @@ install_fonts()
   # shellcheck disable=SC2048,SC2086
   ./install.sh -q -s ${fonts[*]}
   msgr run_done "Done"
+  return 0
 }
+# Remove the temporary nerd-fonts clone directory
 remove_tmp_path()
 {
   rm -rf "$TMP_PATH"
+  return 0
 }
+# Clone, sparse-checkout, install fonts, and clean up
 main()
 {
   clone_or_update_repo
   add_fonts_to_sparse_checkout
   install_fonts
   remove_tmp_path
+  return 0
 }
 main "$@"


@@ -45,12 +45,15 @@ install_extensions()
     gh extension install "$ext"
     echo ""
   done
+  return 0
 }
+# Install all GitHub CLI extensions
 main()
 {
   install_extensions
   msgr run_done "Done"
+  return 0
 }
 main "$@"


@@ -15,9 +15,15 @@ if ! command -v git-crypt &> /dev/null; then
   BUILD_PATH="$(mktemp -d)"
   trap 'rm -rf "$BUILD_PATH"' EXIT
-  if [ ! -f "$CHECK_PATH" ]; then
+  if [[ ! -f "$CHECK_PATH" ]]; then
-    git clone --depth 1 "$REPO_URL" "$BUILD_PATH" || { msgr err "Failed to clone $REPO_URL"; exit 1; }
-    cd "$BUILD_PATH" || { msgr err "$BUILD_PATH not found"; exit 1; }
+    git clone --depth 1 "$REPO_URL" "$BUILD_PATH" || {
+      msgr err "Failed to clone $REPO_URL"
+      exit 1
+    }
+    cd "$BUILD_PATH" || {
+      msgr err "$BUILD_PATH not found"
+      exit 1
+    }
     make && make install PREFIX="$HOME/.local"
   else
     msgr run_done "git-crypt ($CHECK_PATH) already installed"


@@ -33,6 +33,7 @@ install_packages()
     go install "$pkg"
     echo ""
   done
+  return 0
 }
 # Function to install completions and run actions for selected packages
@@ -44,6 +45,7 @@ post_install()
     git-profile completion zsh > "$ZSH_CUSTOM_COMPLETION_PATH/_git-profile" \
       && msgr run_done "Installed completions for git-profile"
   fi
+  return 0
 }
 # Function to clear go cache
@@ -51,14 +53,17 @@ clear_go_cache()
 {
   msgr run "Clearing go cache"
   go clean -cache -modcache
+  return 0
 }
+# Install go packages, completions, and clear cache
 main()
 {
   install_packages
   post_install
   clear_go_cache
   msgr run_done "Done"
+  return 0
 }
 main "$@"


@@ -5,7 +5,7 @@ set -uo pipefail
 # This script contains large portions from following scripts:
 # - https://github.com/freekmurze/dotfiles/blob/main/macos/set-defaults.sh
-[ "$(uname)" != "Darwin" ] && echo "Not a macOS system" && exit 0
+[[ "$(uname)" != "Darwin" ]] && echo "Not a macOS system" && exit 0
 # shellcheck source=shared.sh
 source "$DOTFILES/config/shared.sh"


@@ -36,14 +36,16 @@ install_packages()
     fi
     echo ""
   done
+  return 0
 }
 # Function to upgrade all global npm packages
 upgrade_global_packages()
 {
   msgr run "Upgrading all global packages"
-  npm -g --no-progress --no-timing --no-fund outdated
+  npm -g --no-progress --no-timing --no-fund outdated || true
   npm -g --no-timing --no-fund upgrade
+  return 0
 }
 # Function to clean npm cache
@@ -53,14 +55,17 @@ clean_npm_cache()
   npm cache verify
   npm cache clean --force
   npm cache verify
+  return 0
 }
+# Install, upgrade, and clean npm packages
 main()
 {
   install_packages
   upgrade_global_packages
   clean_npm_cache
   msgr yay "npm package installations complete"
+  return 0
 }
 main "$@"
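The `|| true` added to the `npm outdated` call above is the standard guard for report-style commands under `set -e`: `npm outdated` exits non-zero whenever anything is outdated, which would otherwise abort the whole script. A minimal sketch, with `flaky_report` as a hypothetical stand-in:

```shell
set -e
flaky_report() { return 1; } # stand-in for a command like `npm outdated`
flaky_report || true         # status neutralized; set -e does not abort
status_msg="still running"   # reached only because of the || true guard
```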


@@ -43,15 +43,18 @@ install_ntfy()
   mkdir -p ~/.config/ntfy
   # Copy config only if it does not exist
-  if [ ! -f "$HOME/.config/ntfy/client.yml" ]; then
+  if [[ ! -f "$HOME/.config/ntfy/client.yml" ]]; then
     cp "$tmpdir/${NTFY_DIR}/client/client.yml" ~/.config/ntfy/client.yml
   fi
+  return 0
 }
+# Download and install ntfy
 main()
 {
   install_ntfy
   msgr "done" "ntfy installation complete"
+  return 0
 }
 main "$@"


@@ -18,6 +18,7 @@ fi
 tools=(
   ansible # IT automation and configuration management
   openapi-python-client # Generate Python API clients from OpenAPI specs
+  ruff # Fast Python linter and formatter
 )
 # Library packages — installed into system Python with `uv pip install --system`
@@ -40,6 +41,7 @@ install_tools()
     uv tool install --upgrade "$pkg"
     echo ""
   done
+  return 0
 }
 # Function to install library packages via uv pip install
@@ -56,6 +58,7 @@ install_libraries()
     uv pip install --system --upgrade "$pkg"
     echo ""
   done
+  return 0
 }
 # Function to upgrade all uv-managed tools
@@ -63,14 +66,17 @@ upgrade_tools()
 {
   msgr run "Upgrading all uv-managed tools"
   uv tool upgrade --all
+  return 0
 }
+# Install Python tools, libraries, and upgrade all
 main()
 {
   install_tools
   install_libraries
   upgrade_tools
   msgr yay "Python package installations complete"
+  return 0
 }
 main "$@"


@@ -3,9 +3,11 @@ set -euo pipefail
 # @description Install XCode CLI Tools with osascript magic.
 # Ismo Vuorinen <https://github.com/ivuorinen> 2018
 #
+# shellcheck source=../config/shared.sh
+source "${DOTFILES}/config/shared.sh"
 # Check if the script is running on macOS
-if [ "$(uname)" != "Darwin" ]; then
+if [[ "$(uname)" != "Darwin" ]]; then
   msgr warn "Not a macOS system"
   exit 0
 fi
@@ -27,6 +29,7 @@ keep_alive_sudo()
     sleep 60
     kill -0 "$$" || exit
   done 2> /dev/null &
+  return 0
 }
 XCODE_TOOLS_PATH="$(xcode-select -p)"
@@ -40,12 +43,13 @@ prompt_xcode_install()
     'tell app "System Events" to display dialog "Please click install when Command Line Developer Tools appears"'
   )"
-  if [ "$XCODE_MESSAGE" = "button returned:OK" ]; then
+  if [[ "$XCODE_MESSAGE" = "button returned:OK" ]]; then
     xcode-select --install
   else
     msgr warn "You have cancelled the installation, please rerun the installer."
     exit 1
   fi
+  return 0
 }
 # Main function
@@ -53,16 +57,17 @@ main()
 {
   keep_alive_sudo
-  if [ -x "$XCODE_SWIFT_PATH" ]; then
+  if [[ -x "$XCODE_SWIFT_PATH" ]]; then
     msgr run "You have swift from xcode-select. Continuing..."
   else
     prompt_xcode_install
   fi
-  until [ -f "$XCODE_SWIFT_PATH" ]; do
+  until [[ -f "$XCODE_SWIFT_PATH" ]]; do
     echo -n "."
     sleep 1
   done
+  return 0
 }
 main "$@"


@@ -14,18 +14,20 @@ clone_z_repo()
   local git_path=$1
   local bin_path=$2
-  if [ ! -d "$bin_path" ]; then
+  if [[ ! -d "$bin_path" ]]; then
     git clone "$git_path" "$bin_path"
     msgr run_done "z installed at $bin_path"
   else
     msgr ok "z ($bin_path/) already installed"
   fi
+  return 0
 }
 # Main function
 main()
 {
   clone_z_repo "$Z_GIT_PATH" "$Z_BIN_PATH"
+  return 0
 }
 main "$@"


@@ -5,7 +5,7 @@
 : "${VERBOSE:=0}"
 # Source the main shared config if not already loaded
-if [ -z "${SHARED_SCRIPTS_SOURCED:-}" ]; then
+if [[ -z "${SHARED_SCRIPTS_SOURCED:-}" ]]; then
   source "${DOTFILES}/config/shared.sh"
   export SHARED_SCRIPTS_SOURCED=1
 fi
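The source-guard above combines two patterns from this changeset: `${VAR:-}` keeps the emptiness test safe under `set -u`, and an exported sentinel prevents the shared config from being sourced twice. A sketch under those assumptions, where `maybe_init` and `SHARED_SOURCED` are hypothetical stand-ins for the real sourcing logic:

```shell
set -u
# Returns 0 on the first call (initialization would happen here),
# 1 on every later call. ${SHARED_SOURCED:-} avoids a set -u failure
# when the sentinel has never been set.
maybe_init() {
  if [[ -z "${SHARED_SOURCED:-}" ]]; then
    export SHARED_SOURCED=1
    return 0 # first call: the shared config would be sourced here
  fi
  return 1 # already initialized; skip
}
```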


@@ -3,7 +3,7 @@
 set -euo pipefail
-if [ -x "node_modules/bats/bin/bats" ]; then
+if [[ -x "node_modules/bats/bin/bats" ]]; then
   git ls-files '*.bats' -z | xargs -0 node_modules/bats/bin/bats
 elif command -v npx > /dev/null; then
   git ls-files '*.bats' -z | xargs -0 npx --yes bats


@@ -1,11 +1,145 @@
 #!/usr/bin/env bats
-setup() {
+setup()
+{
   export DOTFILES="$PWD"
 }
+
+# ── Group 1: Main help & routing ──────────────────────────────
 @test "dfm help shows usage" {
   run bash local/bin/dfm help
   [ "$status" -eq 0 ]
   [[ "$output" == *"Usage: dfm"* ]]
 }
+
+@test "dfm with no args shows full usage with all sections" {
+  run bash local/bin/dfm
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"Usage: dfm"* ]]
+  [[ "$output" == *"dfm install"* ]]
+  [[ "$output" == *"dfm check"* ]]
+  [[ "$output" == *"dfm helpers"* ]]
+  [[ "$output" == *"dfm docs"* ]]
+  [[ "$output" == *"dfm dotfiles"* ]]
+  [[ "$output" == *"dfm scripts"* ]]
+}
+
+@test "dfm with invalid section shows usage" {
+  run bash local/bin/dfm nonexistent
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"Usage: dfm"* ]]
+}
+
+# ── Group 2: Install menu completeness ────────────────────────
+@test "dfm install menu shows all entries" {
+  run bash local/bin/dfm install
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"all"* ]]
+  [[ "$output" == *"apt-packages"* ]]
+  [[ "$output" == *"cargo"* ]]
+  [[ "$output" == *"cheat-databases"* ]]
+  [[ "$output" == *"composer"* ]]
+  [[ "$output" == *"dnf-packages"* ]]
+  [[ "$output" == *"fonts"* ]]
+  [[ "$output" == *"gh"* ]]
+  [[ "$output" == *"git-crypt"* ]]
+  [[ "$output" == *"go"* ]]
+  [[ "$output" == *"imagick"* ]]
+  [[ "$output" == *"macos"* ]]
+  [[ "$output" == *"npm-packages"* ]]
+  [[ "$output" == *"ntfy"* ]]
+  [[ "$output" == *"nvm-latest"* ]]
+  [[ "$output" == *"nvm"* ]]
+  [[ "$output" == *"python-packages"* ]]
+  [[ "$output" == *"xcode-cli-tools"* ]]
+  [[ "$output" == *"z"* ]]
+}
+
+@test "dfm install with invalid subcommand shows menu" {
+  run bash local/bin/dfm install nonexistent
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"dfm install"* ]]
+}
+
+# ── Group 3: Other section menus ──────────────────────────────
+@test "dfm helpers menu shows entries" {
+  run bash local/bin/dfm helpers
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"aliases"* ]]
+  [[ "$output" == *"colors"* ]]
+  [[ "$output" == *"path"* ]]
+  [[ "$output" == *"env"* ]]
+}
+
+@test "dfm docs menu shows entries" {
+  run bash local/bin/dfm docs
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"all"* ]]
+  [[ "$output" == *"tmux"* ]]
+  [[ "$output" == *"nvim"* ]]
+  [[ "$output" == *"wezterm"* ]]
+}
+
+@test "dfm dotfiles menu shows entries" {
+  run bash local/bin/dfm dotfiles
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"fmt"* ]]
+  [[ "$output" == *"shfmt"* ]]
+  [[ "$output" == *"yamlfmt"* ]]
+}
+
+@test "dfm check menu shows entries" {
+  run bash local/bin/dfm check
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"arch"* ]]
+  [[ "$output" == *"host"* ]]
+}
+
+@test "dfm scripts menu lists install scripts" {
+  run bash local/bin/dfm scripts
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"cargo-packages"* ]]
+  [[ "$output" == *"fonts"* ]]
+  [[ "$output" == *"z"* ]]
+}
+
+@test "dfm tests menu shows entries" {
+  run bash local/bin/dfm tests
+  [ "$status" -eq 0 ]
+  [[ "$output" == *"msgr"* ]]
+  [[ "$output" == *"params"* ]]
+}
+
+# ── Group 4: Check commands ───────────────────────────────────
+@test "dfm check arch returns current arch" {
+  run bash local/bin/dfm check arch
+  [ "$status" -eq 0 ]
+  [ -n "$output" ]
+}
+
+@test "dfm check arch with matching value exits 0" {
+  local current_arch
+  current_arch="$(uname)"
+  run bash local/bin/dfm check arch "$current_arch"
+  [ "$status" -eq 0 ]
+}
+
+@test "dfm check arch with non-matching value exits 1" {
+  run bash local/bin/dfm check arch FakeArch
+  [ "$status" -eq 1 ]
+}
+
+@test "dfm check host returns current hostname" {
+  run bash local/bin/dfm check host
+  [ "$status" -eq 0 ]
+  [ -n "$output" ]
+}
+
+@test "dfm check host with non-matching value exits 1" {
+  run bash local/bin/dfm check host fakehostname
+  [ "$status" -eq 1 ]
+}

yarn.lock

@@ -96,6 +96,22 @@ __metadata:
languageName: node languageName: node
linkType: hard linkType: hard
"@isaacs/balanced-match@npm:^4.0.1":
version: 4.0.1
resolution: "@isaacs/balanced-match@npm:4.0.1"
checksum: 10c0/7da011805b259ec5c955f01cee903da72ad97c5e6f01ca96197267d3f33103d5b2f8a1af192140f3aa64526c593c8d098ae366c2b11f7f17645d12387c2fd420
languageName: node
linkType: hard
"@isaacs/brace-expansion@npm:^5.0.1":
version: 5.0.1
resolution: "@isaacs/brace-expansion@npm:5.0.1"
dependencies:
"@isaacs/balanced-match": "npm:^4.0.1"
checksum: 10c0/e5d67c7bbf1f17b88132a35bc638af306d48acbb72810d48fa6e6edd8ab375854773108e8bf70f021f7ef6a8273455a6d1f0c3b5aa2aff06ce7894049ab77fb8
languageName: node
linkType: hard
"@types/node@npm:^24.0.1": "@types/node@npm:^24.0.1":
version: 24.10.9 version: 24.10.9
resolution: "@types/node@npm:24.10.9" resolution: "@types/node@npm:24.10.9"
@@ -114,6 +130,25 @@ __metadata:
  languageName: node
  linkType: hard

"debug@npm:^4.3.4":
  version: 4.4.3
  resolution: "debug@npm:4.4.3"
  dependencies:
    ms: "npm:^2.1.3"
  peerDependenciesMeta:
    supports-color:
      optional: true
  checksum: 10c0/d79136ec6c83ecbefd0f6a5593da6a9c91ec4d7ddc4b54c883d6e71ec9accb5f67a1a5e96d00a328196b5b5c86d365e98d8a3a70856aaf16b4e7b1985e67f5a6
  languageName: node
  linkType: hard

"deep-is@npm:^0.1.3":
  version: 0.1.4
  resolution: "deep-is@npm:0.1.4"
  checksum: 10c0/7f0ee496e0dff14a573dc6127f14c95061b448b87b995fc96c017ce0a1e66af1675e73f1d6064407975bc4ea6ab679497a29fff7b5b9c4e99cb10797c1ad0b4c
  languageName: node
  linkType: hard

"editorconfig-checker@npm:^6.1.0":
  version: 6.1.1
  resolution: "editorconfig-checker@npm:6.1.1"
@@ -124,6 +159,49 @@ __metadata:
  languageName: node
  linkType: hard

"fast-levenshtein@npm:^2.0.6":
  version: 2.0.6
  resolution: "fast-levenshtein@npm:2.0.6"
  checksum: 10c0/111972b37338bcb88f7d9e2c5907862c280ebf4234433b95bc611e518d192ccb2d38119c4ac86e26b668d75f7f3894f4ff5c4982899afced7ca78633b08287c4
  languageName: node
  linkType: hard

"find-package-json@npm:^1.2.0":
  version: 1.2.0
  resolution: "find-package-json@npm:1.2.0"
  checksum: 10c0/85d6c97afb9f8f0deb0d344a1c4eb8027347cf4d61666c28d3ac3f913e916684441218682b3dd6f8ad570e5d43c96a7db521f70183d70df559d07e1f99cdc635
  languageName: node
  linkType: hard

"fs-extra@npm:^11.1.1":
  version: 11.3.3
  resolution: "fs-extra@npm:11.3.3"
  dependencies:
    graceful-fs: "npm:^4.2.0"
    jsonfile: "npm:^6.0.1"
    universalify: "npm:^2.0.0"
  checksum: 10c0/984924ff4104e3e9f351b658a864bf3b354b2c90429f57aec0acd12d92c4e6b762cbacacdffb4e745b280adce882e1f980c485d9f02c453f769ab4e7fc646ce3
  languageName: node
  linkType: hard

"glob@npm:^13.0.0":
  version: 13.0.1
  resolution: "glob@npm:13.0.1"
  dependencies:
    minimatch: "npm:^10.1.2"
    minipass: "npm:^7.1.2"
    path-scurry: "npm:^2.0.0"
  checksum: 10c0/af7b863dec8dff74f61d7d6e53104e1f6bbdd482157a196cade8ed857481e876ec35181b38a059b2a7b93ea3b08248f4ff0792fef6dc91814fd5097a716f48e4
  languageName: node
  linkType: hard

"graceful-fs@npm:^4.1.6, graceful-fs@npm:^4.2.0":
  version: 4.2.11
  resolution: "graceful-fs@npm:4.2.11"
  checksum: 10c0/386d011a553e02bc594ac2ca0bd6d9e4c22d7fa8cfbfc448a6d148c59ea881b092db9dbe3547ae4b88e55f1b01f7c4a2ecc53b310c042793e63aa44cf6c257f2
  languageName: node
  linkType: hard

"ivuorinen-dotfiles@workspace:.":
  version: 0.0.0-use.local
  resolution: "ivuorinen-dotfiles@workspace:."
@@ -132,10 +210,139 @@ __metadata:
"@types/node": "npm:^24.0.1" "@types/node": "npm:^24.0.1"
bats: "npm:^1.12.0" bats: "npm:^1.12.0"
editorconfig-checker: "npm:^6.1.0" editorconfig-checker: "npm:^6.1.0"
markdown-table-formatter: "npm:^1.7.0"
prettier: "npm:^3.8.1"
typescript: "npm:^5.8.3" typescript: "npm:^5.8.3"
languageName: unknown languageName: unknown
linkType: soft linkType: soft
"jsonfile@npm:^6.0.1":
version: 6.2.0
resolution: "jsonfile@npm:6.2.0"
dependencies:
graceful-fs: "npm:^4.1.6"
universalify: "npm:^2.0.0"
dependenciesMeta:
graceful-fs:
optional: true
checksum: 10c0/7f4f43b08d1869ded8a6822213d13ae3b99d651151d77efd1557ced0889c466296a7d9684e397bd126acf5eb2cfcb605808c3e681d0fdccd2fe5a04b47e76c0d
languageName: node
linkType: hard
"levn@npm:^0.4.1":
version: 0.4.1
resolution: "levn@npm:0.4.1"
dependencies:
prelude-ls: "npm:^1.2.1"
type-check: "npm:~0.4.0"
checksum: 10c0/effb03cad7c89dfa5bd4f6989364bfc79994c2042ec5966cb9b95990e2edee5cd8969ddf42616a0373ac49fac1403437deaf6e9050fbbaa3546093a59b9ac94e
languageName: node
linkType: hard
"lru-cache@npm:^11.0.0":
version: 11.2.5
resolution: "lru-cache@npm:11.2.5"
checksum: 10c0/cc98958d25dddf1c8a8cbdc49588bd3b24450e8dfa78f32168fd188a20d4a0331c7406d0f3250c86a46619ee288056fd7a1195e8df56dc8a9592397f4fbd8e1d
languageName: node
linkType: hard
"markdown-table-formatter@npm:^1.7.0":
version: 1.7.0
resolution: "markdown-table-formatter@npm:1.7.0"
dependencies:
debug: "npm:^4.3.4"
find-package-json: "npm:^1.2.0"
fs-extra: "npm:^11.1.1"
glob: "npm:^13.0.0"
markdown-table-prettify: "npm:^3.6.0"
optionator: "npm:^0.9.4"
bin:
markdown-table-formatter: lib/index.js
checksum: 10c0/0f0d5eaec2c3bb9c60328ffbb4652305845def5387f4c87dd6e83559ef793961353af64ae44bce9cda3394469e419e046ae42fe7e9cafd47414b42deaa28f3b7
languageName: node
linkType: hard
"markdown-table-prettify@npm:^3.6.0":
version: 3.7.0
resolution: "markdown-table-prettify@npm:3.7.0"
bin:
markdown-table-prettify: cli/index.js
checksum: 10c0/f387b1ca81ceaa201bda2ce1db8e4d392a4d4ac3d7bb3173c7d9e3d9ca389e31d247eee2ccd2fa30f3132ae2447dc51285fb68636cdaf825633a43a499f41cd6
languageName: node
linkType: hard
"minimatch@npm:^10.1.2":
version: 10.1.2
resolution: "minimatch@npm:10.1.2"
dependencies:
"@isaacs/brace-expansion": "npm:^5.0.1"
checksum: 10c0/0cccef3622201703de6ecf9d772c0be1d5513dcc038ed9feb866c20cf798243e678ac35605dac3f1a054650c28037486713fe9e9a34b184b9097959114daf086
languageName: node
linkType: hard
"minipass@npm:^7.1.2":
version: 7.1.2
resolution: "minipass@npm:7.1.2"
checksum: 10c0/b0fd20bb9fb56e5fa9a8bfac539e8915ae07430a619e4b86ff71f5fc757ef3924b23b2c4230393af1eda647ed3d75739e4e0acb250a6b1eb277cf7f8fe449557
languageName: node
linkType: hard
"ms@npm:^2.1.3":
version: 2.1.3
resolution: "ms@npm:2.1.3"
checksum: 10c0/d924b57e7312b3b63ad21fc5b3dc0af5e78d61a1fc7cfb5457edaf26326bf62be5307cc87ffb6862ef1c2b33b0233cdb5d4f01c4c958cc0d660948b65a287a48
languageName: node
linkType: hard
"optionator@npm:^0.9.4":
version: 0.9.4
resolution: "optionator@npm:0.9.4"
dependencies:
deep-is: "npm:^0.1.3"
fast-levenshtein: "npm:^2.0.6"
levn: "npm:^0.4.1"
prelude-ls: "npm:^1.2.1"
type-check: "npm:^0.4.0"
word-wrap: "npm:^1.2.5"
checksum: 10c0/4afb687a059ee65b61df74dfe87d8d6815cd6883cb8b3d5883a910df72d0f5d029821f37025e4bccf4048873dbdb09acc6d303d27b8f76b1a80dd5a7d5334675
languageName: node
linkType: hard
"path-scurry@npm:^2.0.0":
version: 2.0.1
resolution: "path-scurry@npm:2.0.1"
dependencies:
lru-cache: "npm:^11.0.0"
minipass: "npm:^7.1.2"
checksum: 10c0/2a16ed0e81fbc43513e245aa5763354e25e787dab0d539581a6c3f0f967461a159ed6236b2559de23aa5b88e7dc32b469b6c47568833dd142a4b24b4f5cd2620
languageName: node
linkType: hard
"prelude-ls@npm:^1.2.1":
version: 1.2.1
resolution: "prelude-ls@npm:1.2.1"
checksum: 10c0/b00d617431e7886c520a6f498a2e14c75ec58f6d93ba48c3b639cf241b54232d90daa05d83a9e9b9fef6baa63cb7e1e4602c2372fea5bc169668401eb127d0cd
languageName: node
linkType: hard
"prettier@npm:^3.8.1":
version: 3.8.1
resolution: "prettier@npm:3.8.1"
bin:
prettier: bin/prettier.cjs
checksum: 10c0/33169b594009e48f570471271be7eac7cdcf88a209eed39ac3b8d6d78984039bfa9132f82b7e6ba3b06711f3bfe0222a62a1bfb87c43f50c25a83df1b78a2c42
languageName: node
linkType: hard
"type-check@npm:^0.4.0, type-check@npm:~0.4.0":
version: 0.4.0
resolution: "type-check@npm:0.4.0"
dependencies:
prelude-ls: "npm:^1.2.1"
checksum: 10c0/7b3fd0ed43891e2080bf0c5c504b418fbb3e5c7b9708d3d015037ba2e6323a28152ec163bcb65212741fa5d2022e3075ac3c76440dbd344c9035f818e8ecee58
languageName: node
linkType: hard
"typescript@npm:^5.8.3": "typescript@npm:^5.8.3":
version: 5.9.3 version: 5.9.3
resolution: "typescript@npm:5.9.3" resolution: "typescript@npm:5.9.3"
@@ -162,3 +369,17 @@ __metadata:
  checksum: 10c0/3033e2f2b5c9f1504bdc5934646cb54e37ecaca0f9249c983f7b1fc2e87c6d18399ebb05dc7fd5419e02b2e915f734d872a65da2e3eeed1813951c427d33cc9a
  languageName: node
  linkType: hard

"universalify@npm:^2.0.0":
  version: 2.0.1
  resolution: "universalify@npm:2.0.1"
  checksum: 10c0/73e8ee3809041ca8b818efb141801a1004e3fc0002727f1531f4de613ea281b494a40909596dae4a042a4fb6cd385af5d4db2e137b1362e0e91384b828effd3a
  languageName: node
  linkType: hard

"word-wrap@npm:^1.2.5":
  version: 1.2.5
  resolution: "word-wrap@npm:1.2.5"
  checksum: 10c0/e0e4a1ca27599c92a6ca4c32260e8a92e8a44f4ef6ef93f803f8ed823f486e0889fc0b93be4db59c8d51b3064951d25e43d434e95dc8c960cc3a63d65d00ba20
  languageName: node
  linkType: hard