Compare commits


26 Commits

renovate[bot]
d54433cbf9 chore(deps): update image python to v3.14.3
Signed-off-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-07 21:28:13 +00:00
765c2fce72 test(dfm): expand bats tests from 1 to 16
Add tests for menu output of all sections (install, helpers, docs,
dotfiles, check, scripts, tests), routing of invalid input, install
menu completeness for all 19 entries, and check arch/host commands.
2026-02-07 23:20:02 +02:00
88eceaf194 fix(dfm): restrict cheat-databases glob to .sh files only
The install-cheat-* glob was matching .md documentation files, causing
errors when bash tried to execute them.
2026-02-07 22:45:08 +02:00
6d72003446 fix(lint): fix all sonarcloud detected issues (#279)
* fix(ci): replace broad permissions with specific scopes in workflows

Replace read-all/write-all with minimum required permission scopes
across all GitHub Actions workflows to follow the principle of least
privilege (SonarCloud rule githubactions:S8234).

* fix(shell): use [[ instead of [ for conditional tests

Replace single brackets with double brackets in bash conditional
expressions across 14 files (28 changes). All scripts use bash
shebangs so [[ is safe everywhere (SonarCloud rule shelldre:S7688).
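As a quick illustration of why this matters (the example is mine, not from the repo), `[[ ]]` avoids word-splitting of unquoted expansions and adds pattern matching that `[ ]` lacks:

```shell
#!/usr/bin/env bash
# Illustrative example (not from the repo) of [[ ]] vs [ ].

var="two words"

# [[ ]] does not word-split unquoted expansions; with [ ], an unquoted
# $var would expand to two arguments and cause a syntax error.
if [[ $var == "two words" ]]; then
  result="match"
fi

# [[ ]] also supports glob patterns, which [ ] cannot do.
file="install-fonts.sh"
[[ $file == install-* ]] && kind="installer"
```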

* fix(shell): add explicit return statements to functions

Add return 0 as the last statement in ~46 shell functions across
17 files that previously relied on implicit return codes
(SonarCloud rule shelldre:S7682).

* fix(shell): assign positional parameters to local variables

Replace direct $1/$2/$3 usage with named local variables in _log(),
msg(), msg_err(), msg_done(), msg_run(), msg_ok(), and array_diff()
(SonarCloud rule shelldre:S7679).
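A sketch of the refactor (this simplified `msg()` body is hypothetical): naming the parameters up front documents the function's interface instead of scattering `$1`/`$2` through the body.

```shell
#!/usr/bin/env bash
# Hypothetical simplified msg(): positional parameters named as locals.

msg()
{
  local level="$1" text="$2" # instead of using $1/$2 directly below
  printf '[%s] %s\n' "$level" "$text"
  return 0
}

out=$(msg info "hello")
```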

* fix(python): replace dict() constructor with literal

Use {} instead of dict() for empty dictionary initialization
(SonarCloud rule python:S7498).

* fix(shell): fix husky shebang and tolerate npm outdated exit code

* docs(shell): add function docstring comments

* fix(shell): fix heredoc indentation in x-sonarcloud

* feat(python): add ruff linter and formatter configuration

* fix(ci): align megalinter config with biome, ruff, and shfmt settings

* fix(ci): disable black and yaml-prettier in megalinter config

* chore(ci): update ruff-pre-commit to v0.15.0 and fix hook name

* fix(scripts): check for .git dir before skipping clone in install-fonts

* fix(shell): address code review issues in scripts and shared.sh

- Guard wezterm show-keys failure in create-wezterm-keymaps.sh
- Stop masking git failures with return 0 in install-cheat-purebashbible.sh
- Add missing shared.sh source in install-xcode-cli-tools.sh
- Replace exit 1 with return 1 in sourced shared.sh

* fix(scripts): address code review and security findings

- Guard wezterm show-keys failure in create-wezterm-keymaps.sh
- Stop masking git failures with return 0 in install-cheat-purebashbible.sh
- Add missing shared.sh source in install-xcode-cli-tools.sh
- Replace exit 1 with return 1 in sourced shared.sh
- Remove shell=True subprocess calls in x-git-largest-files.py

* style(shell): apply shfmt formatting and add args to pre-commit hook

* fix(python): suppress bandit false positives in x-git-largest-files

* fix(python): add nosemgrep suppression for check_output call

* feat(format): add prettier for YAML formatting

Install prettier, add .prettierrc.json config (200-char width, 2-space
indent, LF endings), .prettierignore, yarn scripts (lint:prettier,
fix:prettier, format:yaml), and pre-commit hook scoped to YAML files.

* style(yaml): apply prettier formatting

* fix(scripts): address remaining code review findings

- Python: use list comprehension to filter empty strings instead of
  slicing off the last element
- create-wezterm-keymaps: write to temp file and mv for atomic updates
- install-xcode-cli-tools: fix shellcheck source directive path
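The temp-file-and-mv idiom mentioned for create-wezterm-keymaps can be sketched like this (paths and content are illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Write to a temp file first, then mv into place. A rename within the
# same filesystem is atomic, so readers never see a half-written file.
# DEST is illustrative; the real script writes a keymap listing.
DEST="$(mktemp -d)/keymaps.txt"

tmp="$(mktemp)"
printf 'generated content\n' > "$tmp"
mv "$tmp" "$DEST"
```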

* fix(python): sort imports alphabetically in x-git-largest-files

* fix(lint): disable PYTHON_ISORT in MegaLinter, ruff handles it

* chore(git): add __pycache__ to gitignore

* fix(python): rename ambiguous variable l to line (E741)

* style: remove trailing whitespace and blank lines

* style(fzf): apply shfmt formatting

* style(shell): apply shfmt formatting

* docs(plans): add design documents

* style(docs): add language specifier to fenced code block

* feat(lint): add markdown-table-formatter to dev tooling

Add markdown-table-formatter as a dev dependency with yarn scripts
(lint:md-table, fix:md-table) and a local pre-commit hook to
automatically format markdown tables on commit.
2026-02-07 19:01:02 +02:00
cff3d1dd8a feat(scripts): add x-sonarcloud script for LLM-driven issue analysis
Bridges LLM agents with SonarCloud's REST API to fetch and format
code quality issues as structured markdown with processing instructions.
2026-02-07 13:24:29 +02:00
a47ce85991 chore: remove hammerspoon type generator and types 2026-02-06 09:12:06 +02:00
13dd701eb7 feat(a): improve encryption script with better error handling
- Add dependency check for age and curl with install instructions
- Add --delete flag to remove originals after encryption
- Add -f/--force flag to control overwrite behavior
- Skip already-encrypted .age files during encryption
- Include hidden files (dotglob) when encrypting directories
- Handle empty directories gracefully with nullglob
- Allow flags in any position (proper option parsing)
- Add set -euo pipefail for better error handling
- Update documentation with all features and examples
- Bump version to 1.1.0
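Accepting flags in any position usually means one pass over all arguments, separating options from operands. The flag names below mirror the ones listed; the parser body is a hedged sketch, not the script's actual code.

```shell
#!/usr/bin/env bash
# Sketch of position-independent option parsing.

parse_args()
{
  FORCE=0 DELETE=0 operands=()
  local arg
  for arg in "$@"; do
    case "$arg" in
      -f | --force) FORCE=1 ;;
      --delete) DELETE=1 ;;
      *) operands+=("$arg") ;;
    esac
  done
  return 0
}

# "secrets.txt --force" and "--force secrets.txt" parse identically.
parse_args secrets.txt --force
```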
2026-02-06 01:51:01 +02:00
cfde007494 fix(shell): clean up rcfiles and remove redundancies
- Remove deprecated GREP_OPTIONS (handled via alias)
- Quote $ZSH_COMPDUMP to prevent word splitting
- Remove duplicate vim alias (nvim alias takes precedence)
- Consolidate completion path to ZSH_CUSTOM_COMPLETION_PATH
- Simplify PATH setup in rcfiles, centralize in exports
- Move LM Studio PATH from rcfiles to exports
- Add clarifying comments for macOS-specific ssh-add
2026-02-06 00:09:03 +02:00
ed4aa2ffe1 feat(scripts): add install-dnf-packages.sh for Fedora/RHEL 2026-02-05 23:46:10 +02:00
bcf11406b6 feat(scripts): add install-apt-packages.sh for Debian/Ubuntu
Install essential developer packages via apt:
- Build tools: build-essential, cmake, pkg-config, autoconf, automake, libtool
- Dev libraries: libssl-dev, libffi-dev, zlib1g-dev, libreadline-dev, etc.
- CLI utilities: jq, tmux, tree, unzip, shellcheck, socat, gnupg

Curated to avoid duplicates with cargo/go installs (ripgrep, fd, fzf, etc.).
Uses batched apt install for efficiency, exits gracefully on non-Debian systems.
2026-02-05 23:42:16 +02:00
443361cddb chore(scripts): remove unused VERBOSE declarations
Remove VERBOSE="${VERBOSE:-0}" from scripts that never reference
$VERBOSE after setting it. The variable is already set in
scripts/shared.sh line 5.
2026-02-05 22:57:20 +02:00
083d30a0c3 fix(scripts): fix shared.sh guard logic and echo -e portability
- shared.sh: simplify guard logic, remove misleading warning message,
  use ${VAR:-} pattern to avoid unbound variable error
- install-cheat-purebashbible.sh: replace echo -e with printf for
  POSIX portability
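The `${VAR:-}` pattern matters under `set -u`, where touching an unset variable aborts the script; a minimal sketch (the variable name is illustrative):

```shell
#!/usr/bin/env bash
set -u
# ${VAR:-} substitutes an empty string when VAR is unset, so the guard
# below is safe even under nounset. SHARED_SOURCED is an illustrative name.

if [ -z "${SHARED_SOURCED:-}" ]; then
  SHARED_SOURCED=1
  first_load="yes"
fi
```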
2026-02-05 22:55:27 +02:00
81190c051a fix(scripts): standardize source paths and quoting
- install-composer.sh: use $DOTFILES instead of $HOME/.dotfiles
- install-macos-defaults.sh: use $DOTFILES, replace which with command -v
- install-xcode-cli-tools.sh: quote command substitution
- create-nvim-keymaps.sh: quote $DEST in nvim redir command
2026-02-05 22:53:04 +02:00
de773ad68f refactor(scripts): add set -euo pipefail to all installer scripts
Add strict error handling to all scripts:
- 13 scripts get `set -euo pipefail`
- install-macos-defaults.sh gets `set -uo pipefail` (without -e) because
  defaults write commands may fail on newer macOS versions
- install-cargo-packages.sh: also add missing source of shared.sh
2026-02-05 22:51:40 +02:00
e8725c4b47 fix(scripts): use safe temp directories and fix composer exit code
- install-ntfy.sh: use mktemp -d with cleanup trap instead of /tmp/ntfy_*
- install-git-crypt.sh: use mktemp -d with cleanup trap instead of /tmp/git-crypt
- install-composer.sh: only move composer.phar if installation succeeded
2026-02-05 22:49:36 +02:00
b8070e2815 fix(scripts): add missing exit after error handlers
Critical bugs where error paths print a message but don't stop execution:
- install-fonts.sh: cd failure now exits properly
- install-ntfy.sh: unsupported OS case now exits with error
- install-git-crypt.sh: git clone and cd failures now exit properly
2026-02-05 22:44:44 +02:00
9de394d8e9 chore(python): add openapi-python-client to uv tools 2026-02-05 22:07:03 +02:00
08de5ea4a6 refactor(submodules): improve old submodule cleanup in add-submodules.sh 2026-02-05 22:07:03 +02:00
0e69b7cb16 refactor: remove dotbot-brew, dotbot-pip submodules and pipx 2026-02-05 22:07:03 +02:00
7c9096d666 fix: standardize shebangs, error handling, and minor issues in x-* scripts
Normalize shebangs to #!/usr/bin/env bash (x-env-list, x-localip).
Use XDG_CONFIG_HOME in x-change-alacritty-theme. Remove unused
VERBOSE variable in x-multi-ping. Add set -euo pipefail to x-when-down
and x-when-up. Add usage header to x-term-colors. Fix notify-call
to notify-send.sh in x-record.
2026-02-05 22:07:03 +02:00
efd9eebc85 fix: resolve critical issues in x-clean-vendordirs, x-foreach, x-ip
x-clean-vendordirs: remove broken msgr dependency (not sourced),
add set -euo pipefail. x-foreach: replace eval on command args with
direct "$@" execution, add usage guard. x-ip: add set -euo pipefail,
curl dependency check, and silent-fail flag.
2026-02-05 22:07:03 +02:00
fc8db1f5b5 refactor(path): consolidate x-path-{append,prepend,remove} as thin wrappers
Add source guard to x-path so its functions can be loaded without
executing the main logic. Rewrite standalone path scripts to source
x-path and call the appropriate function directly, eliminating code
duplication while preserving source-ability for shell integration.
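A source guard typically compares `BASH_SOURCE` with `$0`; the function below is a hypothetical stand-in for the x-path helpers, not their actual code.

```shell
#!/usr/bin/env bash
# When sourced, only define functions; when executed, also run main logic.

path_append()
{
  case ":$PATH:" in
    *":$1:"*) ;; # already present, keep PATH unchanged
    *) PATH="$PATH:$1" ;;
  esac
  return 0
}

# BASH_SOURCE[0] equals $0 only when the script is executed directly.
if [[ "${BASH_SOURCE[0]}" == "$0" ]]; then
  path_append "${1:-}"
  printf '%s\n' "$PATH"
fi
```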
2026-02-05 22:07:03 +02:00
4414e0c3b6 chore: remove unused x-mkd and x-validate-sha256sum.sh scripts
x-mkd's cd-in-subshell cannot work when executed (only sourced) and
is unused in the repo. x-validate-sha256sum.sh duplicates the
functionality of x-sha256sum-matcher.
2026-02-05 22:07:03 +02:00
abb6c9f615 refactor(dfm): clean up portability, dead code, and error handling
Add bash 4.0+ version check with macOS Homebrew bootstrap. Remove
unreachable fish shell detection and source_file function. Fix bugs:
remove dead ntfy menu entry, fix msg/msgr case mismatch in tests,
guard shift calls against empty args, quote $width, fix $"..." locale
string, fix exit 0 on apt error. Replace declare -A with indexed
array in section_scripts. Use early-return guards with msgr warn for
unavailable brew/apt. Replace exit with return in section functions.
2026-02-05 22:07:03 +02:00
57b566704e fix(lint): resolve all lint errors and remove dangling symlinks
Fix 14 editorconfig/biome errors across 6 files: update biome schema
version, replace tabs with spaces, fix continuation indents, and wrap
long lines. Remove dangling OrbStack fish completion symlinks.
2026-02-05 22:07:03 +02:00
renovate[bot]
4510e62070 chore(deps): update ivuorinen/actions action (v2026.01.21 → v2026.02.03) (#278)
Signed-off-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2026-02-05 17:56:56 +00:00
121 changed files with 2793 additions and 19912 deletions


@@ -8,6 +8,10 @@ indent_style = space
 insert_final_newline = true
 trim_trailing_whitespace = true

+[*.py]
+indent_size = 4
+max_line_length = 120
+
 [*.fish]
 max_line_length = 120

.github/README.md (6 changes)

@@ -37,7 +37,7 @@ see what interesting stuff you've done with it. Sharing is caring.
 ### Interesting folders
 | Path | Description |
-| ------------------- | -------------------------------------------- |
+|---------------------|----------------------------------------------|
 | `.github` | GitHub Repository configuration files, meta. |
 | `hosts/{hostname}/` | Configs that should apply to that host only. |
 | `local/bin` | Helper scripts that I've collected or wrote. |
@@ -52,7 +52,7 @@ is processed by Dotbot during installation.
 ### dotfile folders
 | Repo | Destination | Description |
-| --------- | ----------- | ------------------------------------------- |
+|-----------|-------------|---------------------------------------------|
 | `base/` | `.*` | `$HOME` level files. |
 | `config/` | `.config/` | Configurations for applications. |
 | `local/` | `.local/` | XDG Base folder: `bin`, `share` and `state` |
@@ -86,7 +86,7 @@ The folder structure follows [XDG Base Directory Specification][xdg] where possible.
 ### XDG Variables
 | Env | Default | Short description |
-| ------------------ | -------------------- | ---------------------------------------------- |
+|--------------------|----------------------|------------------------------------------------|
 | `$XDG_BIN_HOME` | `$HOME/.local/bin` | Local binaries |
 | `$XDG_CONFIG_HOME` | `$HOME/.config` | User-specific configs |
 | `$XDG_DATA_HOME` | `$HOME/.local/share` | User-specific data files |


@@ -9,13 +9,15 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   debug-changelog:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: read
     steps:
       - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # v6.0.2
@@ -27,7 +29,7 @@ jobs:
           token: ${{ secrets.GITHUB_TOKEN }}
           config_file: .github/tag-changelog-config.js
-      - name: 'Echo results'
+      - name: "Echo results"
         id: output-changelog
         run: |
           echo "${{ steps.changelog.outputs.changes }}"


@@ -11,7 +11,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   Linter:
@@ -35,4 +36,4 @@ jobs:
           token: ${{ secrets.GITHUB_TOKEN }}
       - name: Run PR Lint
-        uses: ivuorinen/actions/pr-lint@f98ae7cd7d0feb1f9d6b01de0addbb11414cfc73 # v2026.01.21
+        uses: ivuorinen/actions/pr-lint@f371da218e9152e7d29ee39358454e41010c36dc # v2026.02.03


@@ -5,19 +5,21 @@ name: Release Daily State
 on:
   workflow_dispatch:
   schedule:
-    - cron: '0 21 * * *' # 00:00 at Europe/Helsinki
+    - cron: "0 21 * * *" # 00:00 at Europe/Helsinki

 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   new-daily-release:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: write
     outputs:
       created: ${{ steps.daily-version.outputs.created }}


@@ -5,14 +5,15 @@ name: Pre-commit autoupdate
 on:
   schedule:
     # At 04:00 on Monday and Thursday.
-    - cron: '0 4 * * 1,4'
+    - cron: "0 4 * * 1,4"
   workflow_dispatch:

 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   auto-update:
@@ -33,6 +34,6 @@ jobs:
         with:
           token: ${{ secrets.GITHUB_TOKEN }}
           branch: update/pre-commit-hooks
-          title: 'chore: update pre-commit hooks'
-          commit-message: 'chore: update pre-commit hooks'
+          title: "chore: update pre-commit hooks"
+          commit-message: "chore: update pre-commit hooks"
           body: Update versions of pre-commit hooks to latest version.


@@ -14,7 +14,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  pull-requests: read

 jobs:
   semantic-pr:


@@ -11,7 +11,7 @@ on:
       - .github/workflows/sync-labels.yml
       - .github/labels.yml
   schedule:
-    - cron: '34 5 * * *'
+    - cron: "34 5 * * *"
   workflow_call:
   workflow_dispatch:
@@ -19,7 +19,8 @@ concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   SyncLabels:
@@ -29,4 +30,4 @@ jobs:
       issues: write
     steps:
-      - uses: ivuorinen/actions/sync-labels@f98ae7cd7d0feb1f9d6b01de0addbb11414cfc73 # v2026.01.21
+      - uses: ivuorinen/actions/sync-labels@f371da218e9152e7d29ee39358454e41010c36dc # v2026.02.03


@@ -5,20 +5,22 @@ name: Update submodules
 on:
   schedule:
     # At 04:00 on Monday and Thursday.
-    - cron: '0 4 * * 1'
+    - cron: "0 4 * * 1"
   workflow_dispatch:

 concurrency:
   group: ${{ github.workflow }}-${{ github.ref }}
   cancel-in-progress: true

-permissions: read-all
+permissions:
+  contents: read

 jobs:
   update-submodules:
     runs-on: ubuntu-latest
-    permissions: write-all
+    permissions:
+      contents: write
     steps:
       - name: Checkout repository

.gitignore (1 change)

@@ -56,5 +56,6 @@ local/man/yabai.1
 local/share/fonts/*
 lock
 node_modules
+__pycache__
 ssh/local.d/*
 config/fish/fish_variables*

.gitmodules (12 changes)

@@ -4,11 +4,6 @@
 	url = https://github.com/anishathalye/dotbot.git
 	ignore = dirty
-[submodule "dotbot-brew"]
-	path = tools/dotbot-brew
-	url = https://github.com/wren/dotbot-brew.git
-	ignore = dirty
 [submodule "dotbot-include"]
 	path = tools/dotbot-include
 	url = https://gitlab.com/gnfzdz/dotbot-include.git
@@ -29,11 +24,6 @@
 	url = https://github.com/tmux-plugins/tmux-sessionist.git
 	ignore = dirty
-[submodule "dotbot-pip"]
-	path = tools/dotbot-pip
-	url = https://github.com/sobolevn/dotbot-pip.git
-	ignore = dirty
 [submodule "tmux/tmux-suspend"]
 	path = config/tmux/plugins/tmux-suspend
 	url = https://github.com/MunifTanjim/tmux-suspend.git
@@ -63,6 +53,8 @@
 [submodule "tmux/tmux-resurrect"]
 	path = config/tmux/plugins/tmux-resurrect
 	url = https://github.com/tmux-plugins/tmux-resurrect.git
+	ignore = dirty
 [submodule "tmux/catppuccin"]
 	path = config/tmux/plugins/catppuccin
 	url = https://github.com/catppuccin/tmux.git
+	ignore = dirty


@@ -6,6 +6,5 @@ config/tmux/plugins/**
 config/vim/plugged/**
 node_modules
 tools/antidote/**
-tools/dotbot-brew/**
 tools/dotbot-include/**
 tools/dotbot/**


@@ -9,16 +9,21 @@ VALIDATE_ALL_CODEBASE: true
FILEIO_REPORTER: false # Generate file.io report
GITHUB_STATUS_REPORTER: true # Generate GitHub status report
IGNORE_GENERATED_FILES: true # Ignore generated files
JAVASCRIPT_DEFAULT_STYLE: prettier # Default style for JavaScript
PRINT_ALPACA: false # Print Alpaca logo in console
SARIF_REPORTER: true # Generate SARIF report
SHOW_SKIPPED_LINTERS: false # Show skipped linters in MegaLinter log
TYPESCRIPT_DEFAULT_STYLE: prettier # Default style for TypeScript
DISABLE_LINTERS:
- REPOSITORY_DEVSKIM
- JAVASCRIPT_ES # using biome
- JAVASCRIPT_PRETTIER # using biome
- TYPESCRIPT_PRETTIER # using biome
- JSON_PRETTIER # using biome
- PYTHON_BLACK # using ruff
- PYTHON_FLAKE8 # using ruff
- PYTHON_PYLINT # using ruff
- PYTHON_ISORT # using ruff (I rules)
YAML_YAMLLINT_CONFIG_FILE: .yamllint.yml
REPOSITORY_GIT_DIFF_DISABLE_ERRORS: true
BASH_SHFMT_ARGUMENTS: -i 2 -bn -ci -sr -fn
FILTER_REGEX_EXCLUDE: >
(node_modules|tools|config/cheat/cheatsheets/community|config/cheat/cheatsheets/tldr|config/fzf|config/zsh|config/tmux/plugins)


@@ -28,12 +28,25 @@ repos:
         entry: yarn biome check --write --files-ignore-unknown=true --no-errors-on-unmatched
         language: system
         files: \.(js|ts|jsx|tsx|json|md)$
+      - id: markdown-table-formatter
+        name: Markdown Table Formatter
+        entry: yarn markdown-table-formatter
+        language: system
+        types: [markdown]
   - repo: https://github.com/adrienverge/yamllint
     rev: v1.38.0
     hooks:
       - id: yamllint
+  - repo: https://github.com/pre-commit/mirrors-prettier
+    rev: v4.0.0-alpha.8
+    hooks:
+      - id: prettier
+        types_or: [yaml]
+        additional_dependencies:
+          - prettier@3.8.1
   - repo: https://github.com/shellcheck-py/shellcheck-py
     rev: v0.11.0.1
     hooks:
@@ -43,6 +56,7 @@ repos:
     rev: v3.12.0-2
     hooks:
       - id: shfmt
+        args: [-i, "2", -bn, -ci, -sr, -fn, -w]
   - repo: https://github.com/rhysd/actionlint
     rev: v1.7.10
@@ -60,3 +74,10 @@ repos:
     hooks:
       - id: fish_syntax
       - id: fish_indent
+  - repo: https://github.com/astral-sh/ruff-pre-commit
+    rev: v0.15.0
+    hooks:
+      - id: ruff-check
+        args: [--fix]
+      - id: ruff-format

.prettierignore (new file, 18 lines)

@@ -0,0 +1,18 @@
+node_modules
+.yarn
+.pnp.*
+.mypy_cache
+Brewfile.lock.json
+lazy-lock.json
+config/cheat/cheatsheets/community
+config/cheat/cheatsheets/tldr
+config/fzf
+config/nvim
+config/op/plugins/used_plugins
+config/tmux/plugins
+config/vim/plugged
+config/zsh
+local/bin/antigen.zsh
+local/bin/asdf
+tools
+docs/plans

.prettierrc.json (new file, 9 lines)

@@ -0,0 +1,9 @@
+{
+  "$schema": "https://json.schemastore.org/prettierrc",
+  "printWidth": 200,
+  "tabWidth": 2,
+  "useTabs": false,
+  "endOfLine": "lf",
+  "singleQuote": false,
+  "proseWrap": "preserve"
+}


@@ -1 +1 @@
-3.14.2
+3.14.3


@@ -13,11 +13,11 @@ ignore_all_files_in_gitignore: true
 # Was previously called `ignored_dirs`, please update your config if you are using that.
 # Added (renamed) on 2025-04-07
 ignored_paths:
-  - '*.swp'
-  - '*.tmp'
-  - '*.tmp.*'
-  - '.DS_Store'
-  - '.git/**'
+  - "*.swp"
+  - "*.tmp"
+  - "*.tmp.*"
+  - ".DS_Store"
+  - ".git/**"
   - /config/cheat/cheatsheets/community/**
   - /config/cheat/cheatsheets/pure-bash-bible/**
   - /config/cheat/cheatsheets/tldr/**
@@ -85,6 +85,6 @@ excluded_tools: []
 # initial prompt for the project. It will always be given to the LLM upon activating the project
 # (contrary to the memories, which are loaded on demand).
-initial_prompt: ''
+initial_prompt: ""
-project_name: '.dotfiles'
+project_name: ".dotfiles"

CLAUDE.md (new file, 123 lines)

@@ -0,0 +1,123 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code)
when working with code in this repository.
## Repository Overview
Personal dotfiles repository for Ismo Vuorinen.
Uses **Dotbot** (not GNU Stow) to symlink configuration files into place.
The directory layout follows the XDG Base Directory Specification.
## Directory Layout and Linking
| Source | Destination | Notes |
|---------------------|-------------------|-------------------------------------------|
| `base/*` | `~/.*` | Home-level dotfiles (`.` added by Dotbot) |
| `config/*` | `~/.config/` | Application configurations |
| `local/bin/*` | `~/.local/bin/` | Helper scripts and utilities |
| `local/share/*` | `~/.local/share/` | Data files |
| `local/man/**` | `~/.local/man/` | Manual pages |
| `ssh/*` | `~/.ssh/` | SSH configuration (mode 0600) |
| `hosts/<hostname>/` | Overlays | Host-specific overrides |
Installation: `./install` runs Dotbot with `install.conf.yaml`,
then applies `hosts/<hostname>/install.conf.yaml` if it exists.
## Commands
```bash
# Install dependencies (required before lint/test)
yarn install
# Linting
yarn lint # Run biome + prettier + editorconfig-checker
yarn lint:biome # Biome only
yarn lint:ec # EditorConfig checker only
# Formatting
yarn fix:biome # Autofix with biome (JS/TS/JSON/MD)
yarn fix:prettier # Autofix with prettier (YAML)
yarn format # Format with biome
yarn format:yaml # Format YAML files with prettier
# Testing (Bats - Bash Automated Testing System)
yarn test # Run all tests in tests/
# Run a single test file:
./node_modules/.bin/bats tests/dfm.bats
# Shell linting
shellcheck <script> # Lint shell scripts
```
## Pre-commit Hooks
Configured in `.pre-commit-config.yaml`: shellcheck, shfmt, biome,
yamllint, prettier, actionlint, stylua, fish_syntax/fish_indent.
Run `pre-commit run --all-files` to check everything.
## Commit Convention
Semantic Commit messages: `type(scope): summary`
(e.g., `fix(tmux): correct prefix binding`).
Enforced by commitlint extending `@ivuorinen/commitlint-config`.
## Architecture
### Shell Configuration Chain
Both `base/bashrc` and `base/zshrc` source `config/shared.sh`,
which loads:
- `config/exports` — environment variables, XDG dirs, PATH
- `config/alias` — shell aliases
Zsh additionally uses **antidote** (in `tools/antidote/`)
for plugin management and **oh-my-posh** for the prompt.
### dfm — Dotfiles Manager
`local/bin/dfm` is the main management script. Key commands:
- `dfm install all` — install everything (called during `./install`)
- `dfm brew install` / `dfm brew update` — Homebrew management
- `dfm docs all` — regenerate documentation under `docs/`
### Submodules
External dependencies are git submodules (Dotbot, plugins,
tmux plugins, cheatsheets, antidote).
Managed by `add-submodules.sh`. All set to `ignore = dirty`.
Updated automatically via GitHub Actions on a schedule.
### Host-specific Configs
Machine-specific overrides live in `hosts/<hostname>/`
with their own `base/`, `config/`, and `install.conf.yaml`.
These are layered on top of the global config during installation.
## Code Style
- **EditorConfig**: 2-space indent, UTF-8, LF line endings.
See `.editorconfig` for per-filetype overrides
(4-space for PHP/fish, tabs for git config).
- **Shell scripts**: Must have a shebang or
`# shellcheck shell=bash` directive.
Follow shfmt settings in `.editorconfig`
(2-space indent, `binary_next_line`,
`switch_case_indent`, `space_redirects`, `function_next_line`).
- **Lua** (neovim config): Formatted with stylua (`stylua.toml`),
90-char line length.
- **JSON/JS/TS/Markdown**: Formatted with Biome (`biome.json`),
80-char width.
- **YAML**: Formatted with Prettier (`.prettierrc.json`),
validated with yamllint (`.yamllint.yml`).
## ShellCheck Disabled Rules
Defined in `.shellcheckrc`:
SC2039 (POSIX `local`), SC2166 (`-o` in test),
SC2154 (unassigned variables), SC1091 (source following),
SC2174 (mkdir -p -m), SC2016 (single-quote expressions).
## Package Manager
Yarn (v4.12.0) is the package manager. Do not use npm.


@@ -5,12 +5,8 @@ git submodule sync --recursive

 # dotbot and plugins
 git submodule add --name dotbot \
   -f https://github.com/anishathalye/dotbot.git tools/dotbot
-git submodule add --name dotbot-brew \
-  -f https://github.com/wren/dotbot-brew.git tools/dotbot-brew
 git submodule add --name dotbot-include \
   -f https://gitlab.com/gnfzdz/dotbot-include.git tools/dotbot-include
-git submodule add --name dotbot-pip \
-  -f https://github.com/sobolevn/dotbot-pip.git tools/dotbot-pip

 # other repos
 git submodule add --name cheat-community \
@@ -46,26 +42,70 @@ done
 # Mark certain repositories shallow
 git config -f .gitmodules submodule.antidote.shallow true

-# remove old submodules
-folders=(
-  "config/tmux/plugins/tpm"
-  "config/tmux/plugins/tmux"
-  "config/tmux/plugins/tmux-menus"
-  "tools/dotbot-crontab"
-  "tools/dotbot-snap"
-  "config/tmux/plugins/tmux-window-name"
-  "config/tmux/plugins/tmux-sensible"
-  "config/tmux/plugins/tmux-mode-indicator"
-  "config/tmux/plugins/tmux-yank"
-  "config/tmux/plugins/tmux-fzf-url"
-  "config/nvim-kickstart"
-  "local/bin/asdf"
-  "local/asdf"
-  "tools/dotbot-asdf"
-)
+# Log a message using msgr if available, else echo
+_log()
+{
+  local msg="$1"
+  if command -v msgr > /dev/null 2>&1; then
+    msgr run_done "$msg"
+  else
+    echo " [ok] $msg"
+  fi
+  return 0
+}
+
+# Remove a stale git submodule and clean up references
+remove_old_submodule()
+{
+  local name="$1" path="$2"
+
+  # Remove working tree
+  if [[ -d "$path" ]]; then
+    rm -rf "$path"
+    _log "Removed $path"
+  fi
+
+  # Remove stale git index entry
+  git rm --cached "$path" 2> /dev/null || true
+
+  # Remove .git/config section keyed by path
+  git config --remove-section "submodule.$path" 2> /dev/null || true
+
+  # Skip name-based cleanup if no submodule name provided
+  [[ -z "$name" ]] && return 0
+
+  # Remove .git/config section keyed by name
+  git config --remove-section "submodule.$name" 2> /dev/null || true
+
+  # Remove .git/modules/<name>/ cached repository
+  if [[ -d ".git/modules/$name" ]]; then
+    rm -rf ".git/modules/$name"
+    _log "Removed .git/modules/$name"
+  fi
+}
+
+# remove old submodules (name:path pairs)
+old_submodules=(
+  "tmux/tpm:config/tmux/plugins/tpm"
+  ":config/tmux/plugins/tmux"
+  "tmux/tmux-menus:config/tmux/plugins/tmux-menus"
+  "dotbot-crontab:tools/dotbot-crontab"
+  "dotbot-snap:tools/dotbot-snap"
+  "tmux/tmux-window-name:config/tmux/plugins/tmux-window-name"
+  "tmux/tmux-sensible:config/tmux/plugins/tmux-sensible"
+  "tmux/tmux-mode-indicator:config/tmux/plugins/tmux-mode-indicator"
+  "tmux/tmux-yank:config/tmux/plugins/tmux-yank"
+  ":config/tmux/plugins/tmux-fzf-url"
+  "nvim-kickstart:config/nvim-kickstart"
+  "asdf:local/bin/asdf"
+  "asdf:local/asdf"
+  "dotbot-asdf:tools/dotbot-asdf"
+  "dotbot-pip:tools/dotbot-pip"
+  "dotbot-brew:tools/dotbot-brew"
+)

-for folder in "${folders[@]}"; do
-  [ -d "$folder" ] \
-    && rm -rf "$folder" \
-    && msgr run_done "Removed old submodule $folder"
+for entry in "${old_submodules[@]}"; do
+  name="${entry%%:*}"
+  path="${entry#*:}"
+  remove_old_submodule "$name" "$path"
 done


@@ -2,6 +2,7 @@
 # shellcheck shell=bash
 export DOTFILES="$HOME/.dotfiles"

+# Minimal PATH for x-have and utilities; full PATH set in shared.sh/exports
 export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$PATH"
 export SHARED_SCRIPTS_SOURCED=0
@@ -11,7 +12,7 @@ source "$DOTFILES/config/shared.sh"
 [ -f "${DOTFILES}/config/fzf/fzf.bash" ] &&
   source "${DOTFILES}/config/fzf/fzf.bash"

-# Import ssh keys in keychain
+# Import ssh keys in keychain (macOS-specific -A flag, silently fails on Linux)
 ssh-add -A 2>/dev/null

 x-have antidot && {
@@ -21,6 +22,3 @@ x-have antidot && {
 PROMPT_DIRTRIM=3
 PROMPT_COMMAND='PS1_CMD1=$(git branch --show-current 2>/dev/null)'
 PS1='\[\e[95m\]\u\[\e[0m\]@\[\e[38;5;22;2m\]\h\[\e[0m\] \[\e[38;5;33m\]\w\[\e[0m\] \[\e[92;2m\]${PS1_CMD1}\n\[\e[39m\]➜\[\e[0m\] '
-
-# Added by LM Studio CLI (lms)
-export PATH="$PATH:$HOME/.lmstudio/bin"


@@ -2,14 +2,14 @@
 -- These globals can be set and accessed:
 --
 globals = {
-	"rawrequire",
+  "rawrequire",
 }

 --
 -- These globals can only be accessed:
 --
 read_globals = {
-	"hs",
-	"ls",
-	"spoon",
+  "hs",
+  "ls",
+  "spoon",
 }

(Two file diffs suppressed because they are too large.)

@@ -7,18 +7,13 @@
 autoload -U promptinit; promptinit

 export DOTFILES="$HOME/.dotfiles"
-LOCAL_SHARE="$HOME/.local/share"
-export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$LOCAL_SHARE/nvim/mason/bin:$LOCAL_SHARE/bob/nvim-bin:$LOCAL_SHARE/cargo/bin:/opt/homebrew/bin:/usr/local/bin:$PATH"
+# Minimal PATH for x-have and utilities; full PATH set in shared.sh/exports
+export PATH="$HOME/.local/bin:$DOTFILES/local/bin:$PATH"
 export SHARED_SCRIPTS_SOURCED=0
 source "$DOTFILES/config/shared.sh"

-# zsh completions directory
-[ -z "$ZSH_COMPLETIONS" ] && export ZSH_COMPLETIONS="$XDG_CONFIG_HOME/zsh/completion"
-# Add zsh completions to FPATH, compinit will be called later
-FPATH="$ZSH_COMPLETIONS:$FPATH"
+# zsh completions directory (ZSH_CUSTOM_COMPLETION_PATH set in shared.sh)
 ZSH_COMPDUMP="$XDG_CACHE_HOME/zsh/zcompdump-${SHORT_HOST}-${ZSH_VERSION}"

 source "$DOTFILES/config/zsh/antidote.zsh"
@@ -37,12 +32,9 @@ source_fzf_config
 x-have antidot && eval "$(antidot init)"

 autoload -Uz compinit bashcompinit
-compinit -d $ZSH_COMPDUMP
+compinit -d "$ZSH_COMPDUMP"
 bashcompinit

 # To customize prompt, run `p10k configure` or edit ~/.p10k.zsh.
 export P10K_CONFIG="$DOTFILES/config/zsh/p10k.zsh"
 [[ ! -f "$P10K_CONFIG" ]] || source "$P10K_CONFIG"
-
-# Added by LM Studio CLI (lms)
-export PATH="$PATH:$HOME/.lmstudio/bin"


@@ -1,5 +1,5 @@
 {
-  "$schema": "https://biomejs.dev/schemas/2.3.1/schema.json",
+  "$schema": "https://biomejs.dev/schemas/2.3.11/schema.json",
   "vcs": {
     "enabled": true,
     "clientKind": "git",


@@ -7,8 +7,6 @@ x-have eza && {
   alias ls="eza -h -s=type --git --icons --group-directories-first"
 }

-alias vim='vim -u "$XDG_CONFIG_HOME/vim/vimrc"'
-
 # Easier navigation: .., ..., ....
 alias ..="cd .."
 alias ...="cd ../.."


@@ -93,13 +93,13 @@ expand-main:
 # Note that not all layouts respond to this command.
 increase-main:
   mod: mod1
-  key: ','
+  key: ","

 # Decrease the number of windows in the main pane.
 # Note that not all layouts respond to this command.
 decrease-main:
   mod: mod1
-  key: '.'
+  key: "."

 # General purpose command for custom layouts.
 # Functionality is layout-dependent.


@@ -150,6 +150,7 @@ commit()
   git commit -a -m "$commitMessage"
 }

+# Run Laravel scheduler in a loop
 scheduler()
 {
   while :; do
@@ -282,7 +283,8 @@ export LESSHISTFILE="$XDG_STATE_HOME"/less/history
 export MANPAGER="less -X"

-# Always enable colored `grep` output
-export GREP_OPTIONS="--color=auto"
+# Note: GREP_OPTIONS is deprecated since GNU grep 2.21
+# Color is handled via alias in config/alias

 # check the window size after each command and, if necessary,
 # update the values of LINES and COLUMNS.
@@ -436,6 +438,10 @@ msg "Setting up Wakatime configuration"
 export WAKATIME_HOME="$XDG_STATE_HOME/wakatime"
 x-dc "$WAKATIME_HOME"

+# LM Studio CLI
+msg "Setting up LM Studio configuration"
+export PATH="$PATH:$HOME/.lmstudio/bin"
+
 # Misc
 msg "Setting up miscellaneous configuration"
 export ZSHZ_DATA="$XDG_STATE_HOME/z"

View File

@@ -1 +0,0 @@
/Applications/OrbStack.app/Contents/MacOS/../Resources/completions/fish/kubectl.fish

View File

@@ -1 +0,0 @@
/Applications/OrbStack.app/Contents/MacOS/../Resources/completions/fish/orbctl.fish

View File

@@ -7,65 +7,67 @@ To be used with a companion fish function like this:
"""
from __future__ import print_function
import json
import os
import signal
import subprocess
import sys
import traceback
BASH = 'bash'
BASH = "bash"
FISH_READONLY = [
'PWD', 'SHLVL', 'history', 'pipestatus', 'status', 'version',
'FISH_VERSION', 'fish_pid', 'hostname', '_', 'fish_private_mode'
"PWD",
"SHLVL",
"history",
"pipestatus",
"status",
"version",
"FISH_VERSION",
"fish_pid",
"hostname",
"_",
"fish_private_mode",
]
IGNORED = [
'PS1', 'XPC_SERVICE_NAME'
]
IGNORED = ["PS1", "XPC_SERVICE_NAME"]
def ignored(name):
if name == 'PWD': # this is read only, but has special handling
if name == "PWD": # this is read only, but has special handling
return False
# ignore other read only variables
if name in FISH_READONLY:
return True
if name in IGNORED or name.startswith("BASH_FUNC"):
return True
if name.startswith('%'):
return True
return False
return name.startswith("%")
def escape(string):
# use json.dumps to reliably escape quotes and backslashes
return json.dumps(string).replace(r'$', r'\$')
return json.dumps(string).replace(r"$", r"\$")
def escape_identifier(word):
return escape(word.replace('?', '\\?'))
return escape(word.replace("?", "\\?"))
def comment(string):
return '\n'.join(['# ' + line for line in string.split('\n')])
return "\n".join(["# " + line for line in string.split("\n")])
def gen_script():
# Use the following instead of /usr/bin/env to read environment so we can
# deal with multi-line environment variables (and other odd cases).
env_reader = "%s -c 'import os,json; print(json.dumps({k:v for k,v in os.environ.items()}))'" % (sys.executable)
args = [BASH, '-c', env_reader]
env_reader = f"{sys.executable} -c 'import os,json; print(json.dumps({{k:v for k,v in os.environ.items()}}))'"
args = [BASH, "-c", env_reader]
output = subprocess.check_output(args, universal_newlines=True)
old_env = output.strip()
pipe_r, pipe_w = os.pipe()
if sys.version_info >= (3, 4):
os.set_inheritable(pipe_w, True)
command = 'eval $1 && ({}; alias) >&{}'.format(
env_reader,
pipe_w
)
args = [BASH, '-c', command, 'bass', ' '.join(sys.argv[1:])]
os.set_inheritable(pipe_w, True)
command = f"eval $1 && ({env_reader}; alias) >&{pipe_w}"
args = [BASH, "-c", command, "bass", " ".join(sys.argv[1:])]
p = subprocess.Popen(args, universal_newlines=True, close_fds=False)
os.close(pipe_w)
with os.fdopen(pipe_r) as f:
@@ -73,9 +75,7 @@ def gen_script():
alias_str = f.read()
if p.wait() != 0:
raise subprocess.CalledProcessError(
returncode=p.returncode,
cmd=' '.join(sys.argv[1:]),
output=new_env + alias_str
returncode=p.returncode, cmd=" ".join(sys.argv[1:]), output=new_env + alias_str
)
new_env = new_env.strip()
@@ -89,41 +89,41 @@ def gen_script():
continue
v1 = old_env.get(k)
if not v1:
script_lines.append(comment('adding %s=%s' % (k, v)))
script_lines.append(comment(f"adding {k}={v}"))
elif v1 != v:
script_lines.append(comment('updating %s=%s -> %s' % (k, v1, v)))
script_lines.append(comment(f"updating {k}={v1} -> {v}"))
# process special variables
if k == 'PWD':
script_lines.append('cd %s' % escape(v))
if k == "PWD":
script_lines.append(f"cd {escape(v)}")
continue
else:
continue
if k == 'PATH':
value = ' '.join([escape(directory)
for directory in v.split(':')])
if k == "PATH": # noqa: SIM108
value = " ".join([escape(directory) for directory in v.split(":")])
else:
value = escape(v)
script_lines.append('set -g -x %s %s' % (k, value))
script_lines.append(f"set -g -x {k} {value}")
for var in set(old_env.keys()) - set(new_env.keys()):
script_lines.append(comment('removing %s' % var))
script_lines.append('set -e %s' % var)
script_lines.append(comment(f"removing {var}"))
script_lines.append(f"set -e {var}")
script = '\n'.join(script_lines)
script = "\n".join(script_lines)
alias_lines = []
for line in alias_str.splitlines():
_, rest = line.split(None, 1)
k, v = rest.split("=", 1)
alias_lines.append("alias " + escape_identifier(k) + "=" + v)
alias = '\n'.join(alias_lines)
alias = "\n".join(alias_lines)
return script + '\n' + alias
return script + "\n" + alias
script_file = os.fdopen(3, 'w')
script_file = os.fdopen(3, "w")
if not sys.argv[1:]:
print('__bass_usage', file=script_file, end='')
print("__bass_usage", file=script_file, end="")
sys.exit(0)
try:
@@ -131,8 +131,8 @@ try:
except subprocess.CalledProcessError as e:
sys.exit(e.returncode)
except Exception:
print('Bass internal error!', file=sys.stderr)
raise # traceback will output to stderr
print("Bass internal error!", file=sys.stderr)
raise # traceback will output to stderr
except KeyboardInterrupt:
signal.signal(signal.SIGINT, signal.SIG_DFL)
os.kill(os.getpid(), signal.SIGINT)

View File

@@ -8,8 +8,8 @@ function fisher --argument-names cmd --description "A plugin manager for Fish"
echo "fisher, version $fisher_version"
case "" -h --help
echo "Usage: fisher install <plugins...> Install plugins"
echo " fisher remove <plugins...> Remove installed plugins"
echo " fisher uninstall <plugins...> Remove installed plugins (alias)"
echo " fisher remove <plugins...> Remove installed plugins"
echo " fisher uninstall <plugins...> Remove installed plugins (alias)"
echo " fisher update <plugins...> Update installed plugins"
echo " fisher update Update all installed plugins"
echo " fisher list [<regex>] List installed plugins matching regex"
@@ -41,7 +41,7 @@ function fisher --argument-names cmd --description "A plugin manager for Fish"
echo "fisher: \"$fish_plugins\" file not found: \"$cmd\"" >&2 && return 1
end
set arg_plugins $file_plugins
else if test "$cmd" = install && ! set --query old_plugins[1]
else if test "$cmd" = install && ! set --query old_plugins[1]
set --append arg_plugins $file_plugins
end

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 737994
fish_pager_color_prefix f4b8e4
fish_pager_color_completion c6d0f5
fish_pager_color_description 737994

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 6e738d
fish_pager_color_prefix f5bde6
fish_pager_color_completion cad3f5
fish_pager_color_description 6e738d

View File

@@ -58,4 +58,3 @@ fish_pager_color_progress 6c7086
fish_pager_color_prefix f5c2e7
fish_pager_color_completion cdd6f4
fish_pager_color_description 6c7086

View File

@@ -13,32 +13,37 @@
if [[ $- =~ i ]]; then
# To use custom commands instead of find, override _fzf_compgen_{path,dir}
if ! declare -f _fzf_compgen_path >/dev/null; then
_fzf_compgen_path() {
if ! declare -f _fzf_compgen_path > /dev/null; then
_fzf_compgen_path()
{
echo "$1"
command find -L "$1" \
-name .git -prune -o -name .hg -prune -o -name .svn -prune -o \( -type d -o -type f -o -type l \) \
-a -not -path "$1" -print 2>/dev/null | sed 's@^\./@@'
-a -not -path "$1" -print 2> /dev/null | sed 's@^\./@@'
}
fi
if ! declare -f _fzf_compgen_dir >/dev/null; then
_fzf_compgen_dir() {
if ! declare -f _fzf_compgen_dir > /dev/null; then
_fzf_compgen_dir()
{
command find -L "$1" \
-name .git -prune -o -name .hg -prune -o -name .svn -prune -o -type d \
-a -not -path "$1" -print 2>/dev/null | sed 's@^\./@@'
-a -not -path "$1" -print 2> /dev/null | sed 's@^\./@@'
}
fi
###########################################################
# To redraw line after fzf closes (printf '\e[5n')
bind '"\e[0n": redraw-current-line' 2>/dev/null
bind '"\e[0n": redraw-current-line' 2> /dev/null
__fzf_comprun() {
__fzf_comprun()
{
if [[ "$(type -t _fzf_comprun 2>&1)" = function ]]; then
_fzf_comprun "$@"
elif [[ -n "${TMUX_PANE-}" ]] && { [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]; }; then
elif [[ -n "${TMUX_PANE-}" ]] && {
[[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]
}; then
shift
fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- "$@"
else
@@ -47,7 +52,8 @@ if [[ $- =~ i ]]; then
fi
}
__fzf_orig_completion() {
__fzf_orig_completion()
{
local l comp f cmd
while read -r l; do
if [[ "$l" =~ ^(.*\ -F)\ *([^ ]*).*\ ([^ ]*)$ ]]; then
@@ -63,7 +69,8 @@ if [[ $- =~ i ]]; then
done
}
_fzf_opts_completion() {
_fzf_opts_completion()
{
local cur prev opts
COMPREPLY=()
cur="${COMP_WORDS[COMP_CWORD]}"
@@ -112,18 +119,18 @@ if [[ $- =~ i ]]; then
--sync"
case "${prev}" in
--tiebreak)
COMPREPLY=($(compgen -W "length begin end index" -- "$cur"))
return 0
;;
--color)
COMPREPLY=($(compgen -W "dark light 16 bw" -- "$cur"))
return 0
;;
--history)
COMPREPLY=()
return 0
;;
--tiebreak)
COMPREPLY=($(compgen -W "length begin end index" -- "$cur"))
return 0
;;
--color)
COMPREPLY=($(compgen -W "dark light 16 bw" -- "$cur"))
return 0
;;
--history)
COMPREPLY=()
return 0
;;
esac
if [[ "$cur" =~ ^-|\+ ]]; then
@@ -134,7 +141,8 @@ if [[ $- =~ i ]]; then
return 0
}
_fzf_handle_dynamic_completion() {
_fzf_handle_dynamic_completion()
{
local cmd orig_var orig ret orig_cmd orig_complete
cmd="$1"
shift
@@ -142,15 +150,15 @@ if [[ $- =~ i ]]; then
orig_var="_fzf_orig_completion_$cmd"
orig="${!orig_var-}"
orig="${orig##*#}"
if [[ -n "$orig" ]] && type "$orig" >/dev/null 2>&1; then
if [[ -n "$orig" ]] && type "$orig" > /dev/null 2>&1; then
$orig "$@"
elif [[ -n "${_fzf_completion_loader-}" ]]; then
orig_complete=$(complete -p "$orig_cmd" 2>/dev/null)
orig_complete=$(complete -p "$orig_cmd" 2> /dev/null)
_completion_loader "$@"
ret=$?
# _completion_loader may not have updated completion for the command
if [[ "$(complete -p "$orig_cmd" 2>/dev/null)" != "$orig_complete" ]]; then
__fzf_orig_completion < <(complete -p "$orig_cmd" 2>/dev/null)
if [[ "$(complete -p "$orig_cmd" 2> /dev/null)" != "$orig_complete" ]]; then
__fzf_orig_completion < <(complete -p "$orig_cmd" 2> /dev/null)
if [[ "${__fzf_nospace_commands-}" = *" $orig_cmd "* ]]; then
eval "${orig_complete/ -F / -o nospace -F }"
else
@@ -161,7 +169,8 @@ if [[ $- =~ i ]]; then
fi
}
__fzf_generic_path_completion() {
__fzf_generic_path_completion()
{
local cur base dir leftover matches trigger cmd
cmd="${COMP_WORDS[0]}"
if [[ $cmd == \\* ]]; then
@@ -207,7 +216,8 @@ if [[ $- =~ i ]]; then
fi
}
_fzf_complete() {
_fzf_complete()
{
# Split arguments around --
local args rest str_arg i sep
args=("$@")
@@ -231,7 +241,7 @@ if [[ $- =~ i ]]; then
local cur selected trigger cmd post
post="$(caller 0 | awk '{print $2}')_post"
type -t "$post" >/dev/null 2>&1 || post=cat
type -t "$post" > /dev/null 2>&1 || post=cat
cmd="${COMP_WORDS[0]//[^A-Za-z0-9_=]/_}"
trigger=${FZF_COMPLETION_TRIGGER-'**'}
@@ -253,50 +263,59 @@ if [[ $- =~ i ]]; then
fi
}
_fzf_path_completion() {
_fzf_path_completion()
{
__fzf_generic_path_completion _fzf_compgen_path "-m" "" "$@"
}
# Deprecated. No file only completion.
_fzf_file_completion() {
_fzf_file_completion()
{
_fzf_path_completion "$@"
}
_fzf_dir_completion() {
_fzf_dir_completion()
{
__fzf_generic_path_completion _fzf_compgen_dir "" "/" "$@"
}
_fzf_complete_kill() {
_fzf_complete_kill()
{
_fzf_proc_completion "$@"
}
_fzf_proc_completion() {
_fzf_proc_completion()
{
_fzf_complete -m --header-lines=1 --preview 'echo {}' --preview-window down:3:wrap --min-height 15 -- "$@" < <(
command ps -eo user,pid,ppid,start,time,command 2>/dev/null ||
command ps -eo user,pid,ppid,time,args # For BusyBox
command ps -eo user,pid,ppid,start,time,command 2> /dev/null \
|| command ps -eo user,pid,ppid,time,args # For BusyBox
)
}
_fzf_proc_completion_post() {
_fzf_proc_completion_post()
{
awk '{print $2}'
}
_fzf_host_completion() {
_fzf_host_completion()
{
_fzf_complete +m -- "$@" < <(
command cat <(command tail -n +1 ~/.ssh/config ~/.ssh/config.d/* /etc/ssh/ssh_config 2>/dev/null | command grep -i '^\s*host\(name\)\? ' | awk '{for (i = 2; i <= NF; i++) print $1 " " $i}' | command grep -v '[*?%]') \
command cat <(command tail -n +1 ~/.ssh/config ~/.ssh/config.d/* /etc/ssh/ssh_config 2> /dev/null | command grep -i '^\s*host\(name\)\? ' | awk '{for (i = 2; i <= NF; i++) print $1 " " $i}' | command grep -v '[*?%]') \
<(command grep -oE '^[[a-z0-9.,:-]+' ~/.ssh/known_hosts | tr ',' '\n' | tr -d '[' | awk '{ print $1 " " $1 }') \
<(command grep -v '^\s*\(#\|$\)' /etc/hosts | command grep -Fv '0.0.0.0') |
awk '{if (length($2) > 0) {print $2}}' | sort -u
<(command grep -v '^\s*\(#\|$\)' /etc/hosts | command grep -Fv '0.0.0.0') \
| awk '{if (length($2) > 0) {print $2}}' | sort -u
)
}
_fzf_var_completion() {
_fzf_var_completion()
{
_fzf_complete -m -- "$@" < <(
declare -xp | sed -En 's|^declare [^ ]+ ([^=]+).*|\1|p'
)
}
_fzf_alias_completion() {
_fzf_alias_completion()
{
_fzf_complete -m -- "$@" < <(
alias | sed -En 's|^alias ([^=]+).*|\1|p'
)
@@ -321,13 +340,14 @@ if [[ $- =~ i ]]; then
svn tar unzip zip"
# Preserve existing completion
__fzf_orig_completion < <(complete -p $d_cmds $a_cmds 2>/dev/null)
__fzf_orig_completion < <(complete -p $d_cmds $a_cmds 2> /dev/null)
if type _completion_loader >/dev/null 2>&1; then
if type _completion_loader > /dev/null 2>&1; then
_fzf_completion_loader=1
fi
__fzf_defc() {
__fzf_defc()
{
local cmd func opts orig_var orig def
cmd="$1"
func="$2"
@@ -354,22 +374,23 @@ if [[ $- =~ i ]]; then
unset cmd d_cmds a_cmds
_fzf_setup_completion() {
_fzf_setup_completion()
{
local kind fn cmd
kind=$1
fn=_fzf_${1}_completion
if [[ $# -lt 2 ]] || ! type -t "$fn" >/dev/null; then
if [[ $# -lt 2 ]] || ! type -t "$fn" > /dev/null; then
echo "usage: ${FUNCNAME[0]} path|dir|var|alias|host|proc COMMANDS..."
return 1
fi
shift
__fzf_orig_completion < <(complete -p "$@" 2>/dev/null)
__fzf_orig_completion < <(complete -p "$@" 2> /dev/null)
for cmd in "$@"; do
case "$kind" in
dir) __fzf_defc "$cmd" "$fn" "-o nospace -o dirnames" ;;
var) __fzf_defc "$cmd" "$fn" "-o default -o nospace -v" ;;
alias) __fzf_defc "$cmd" "$fn" "-a" ;;
*) __fzf_defc "$cmd" "$fn" "-o default -o bashdefault" ;;
dir) __fzf_defc "$cmd" "$fn" "-o nospace -o dirnames" ;;
var) __fzf_defc "$cmd" "$fn" "-o default -o nospace -v" ;;
alias) __fzf_defc "$cmd" "$fn" "-a" ;;
*) __fzf_defc "$cmd" "$fn" "-o default -o bashdefault" ;;
esac
done
}

View File

@@ -4,7 +4,7 @@
# Auto-completion
# ---------------
# shellcheck source=completion.bash
[[ $- == *i* ]] && source "$HOME/.dotfiles/config/fzf/completion.bash" 2>/dev/null
[[ $- == *i* ]] && source "$HOME/.dotfiles/config/fzf/completion.bash" 2> /dev/null
# Key bindings
# ------------

View File

@@ -13,7 +13,8 @@
# Key bindings
# ------------
__fzf_select__() {
__fzf_select__()
{
local cmd opts
cmd="${FZF_CTRL_T_COMMAND:-"command find -L . -mindepth 1 \\( -path '*/\\.*' -o -fstype 'sysfs' -o -fstype 'devfs' -o -fstype 'devtmpfs' -o -fstype 'proc' \\) -prune \
-o -type f -print \
@@ -21,27 +22,32 @@ __fzf_select__() {
-o -type l -print 2> /dev/null | cut -b3-"}"
opts="--height ${FZF_TMUX_HEIGHT:-40%} --bind=ctrl-z:ignore --reverse ${FZF_DEFAULT_OPTS-} ${FZF_CTRL_T_OPTS-} -m"
# shellcheck disable=SC2091 # Intentionally execute output of __fzfcmd
eval "$cmd" | FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) "$@" |
while read -r item; do
eval "$cmd" | FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) "$@" \
| while read -r item; do
printf '%q ' "$item" # escape special chars
done
}
if [[ $- =~ i ]]; then
__fzfcmd() {
[[ -n "${TMUX_PANE-}" ]] && { [[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]; } &&
echo "fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- " || echo "fzf"
__fzfcmd()
{
[[ -n "${TMUX_PANE-}" ]] && {
[[ "${FZF_TMUX:-0}" != 0 ]] || [[ -n "${FZF_TMUX_OPTS-}" ]]
} \
&& echo "fzf-tmux ${FZF_TMUX_OPTS:--d${FZF_TMUX_HEIGHT:-40%}} -- " || echo "fzf"
}
fzf-file-widget() {
fzf-file-widget()
{
local selected
selected="$(__fzf_select__ "$@")"
READLINE_LINE="${READLINE_LINE:0:$READLINE_POINT}$selected${READLINE_LINE:$READLINE_POINT}"
READLINE_POINT=$((READLINE_POINT + ${#selected}))
}
__fzf_cd__() {
__fzf_cd__()
{
local cmd opts dir
cmd="${FZF_ALT_C_COMMAND:-"command find -L . -mindepth 1 \\( -path '*/\\.*' -o -fstype 'sysfs' -o -fstype 'devfs' -o -fstype 'devtmpfs' -o -fstype 'proc' \\) -prune \
-o -type d -print 2> /dev/null | cut -b3-"}"
@@ -53,16 +59,17 @@ if [[ $- =~ i ]]; then
) && printf 'builtin cd -- %q' "$dir"
}
__fzf_history__() {
__fzf_history__()
{
local output opts script
opts="--height ${FZF_TMUX_HEIGHT:-40%} --bind=ctrl-z:ignore ${FZF_DEFAULT_OPTS-} -n2..,.. --scheme=history --bind=ctrl-r:toggle-sort ${FZF_CTRL_R_OPTS-} +m --read0"
script='BEGIN { getc; $/ = "\n\t"; $HISTCOUNT = $ENV{last_hist} + 1 } s/^[ *]//; print $HISTCOUNT - $. . "\t$_" if !$seen{$_}++'
# shellcheck disable=SC2091 # Intentionally execute output of __fzfcmd
output=$(
set +o pipefail
builtin fc -lnr -2147483648 |
last_hist=$(HISTTIMEFORMAT='' builtin history 1) perl -n -l0 -e "$script" |
FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) --query "$READLINE_LINE"
builtin fc -lnr -2147483648 \
| last_hist=$(HISTTIMEFORMAT='' builtin history 1) perl -n -l0 -e "$script" \
| FZF_DEFAULT_OPTS="$opts" $(__fzfcmd) --query "$READLINE_LINE"
) || return
READLINE_LINE=${output#*$'\t'}
if [[ -z "$READLINE_POINT" ]]; then

View File

@@ -52,4 +52,4 @@ keybindings:
prs: []
repoPaths: {}
pager:
diff: ''
diff: ""

View File

@@ -1,3 +1,3 @@
---
git_protocol: https
version: '1'
version: "1"

View File

@@ -260,8 +260,6 @@ brew "php@8.2", link: true
brew "php@8.3"
# Pins GitHub Actions to full hashes and versions
brew "pinact"
# Execute binaries from Python packages in isolated environments
brew "pipx"
# Python version management
brew "pyenv"
# Migrate pip packages from one Python version to another

View File

@@ -1,4 +1,4 @@
#!/bin/env bash
#!/usr/bin/env bash
[ -z "$NVM_DIR" ] && export NVM_DIR="$HOME/.config/nvm"
[ -s "$NVM_DIR/nvm.sh" ] && \. "$NVM_DIR/nvm.sh" # This loads nvm
[[ -z "$NVM_DIR" ]] && export NVM_DIR="$HOME/.config/nvm"
[[ -s "$NVM_DIR/nvm.sh" ]] && \. "$NVM_DIR/nvm.sh" # This loads nvm

View File

@@ -40,7 +40,8 @@ return {
operators = {},
-- miscs = {}, -- Uncomment to turn off hard-coded styles
},
lsp_styles = { -- Handles the style of specific lsp hl groups (see `:h lsp-highlight`).
-- Style of specific lsp hl groups (`:h lsp-highlight`)
lsp_styles = {
virtual_text = {
errors = { 'italic' },
hints = { 'italic' },
@@ -72,7 +73,8 @@ return {
enabled = true,
indentscope_color = '',
},
-- For more plugins integrations please scroll down (https://github.com/catppuccin/nvim#integrations)
-- More integrations:
-- github.com/catppuccin/nvim#integrations
},
}
@@ -115,7 +117,8 @@ return {
{
'dmtrKovalenko/fff.nvim',
build = function()
-- this will download prebuild binary or try to use existing rustup toolchain to build from source
-- Downloads prebuilt binary or uses rustup
-- toolchain to build from source
-- (if you are using lazy you can use gb for rebuilding a plugin if needed)
require('fff.download').download_or_build_binary()
end,
@@ -124,7 +127,8 @@ return {
opts = { -- (optional)
debug = {
enabled = true, -- we expect your collaboration at least during the beta
show_scores = true, -- to help us optimize the scoring system, feel free to share your scores!
-- Share scores to help optimize scoring
show_scores = true,
},
},
-- No need to lazy-load with lazy.nvim.

View File

@@ -5,7 +5,7 @@
# shellcheck shell=bash
# Defaults
[ -z "$DOTFILES" ] && export DOTFILES="$HOME/.dotfiles"
[[ -z "$DOTFILES" ]] && export DOTFILES="$HOME/.dotfiles"
DOTFILES_CURRENT_SHELL=$(basename "$SHELL")
export DOTFILES_CURRENT_SHELL
@@ -15,7 +15,7 @@ VERBOSE="${VERBOSE:-0}"
DEBUG="${DEBUG:-0}"
# Enable debugging with DEBUG=1
[ "${DEBUG:-0}" -eq 1 ] && set -x
[[ "${DEBUG:-0}" -eq 1 ]] && set -x
# Detect the current shell
CURRENT_SHELL=$(ps -p $$ -ocomm= | awk -F/ '{print $NF}')
@@ -33,9 +33,10 @@ x-path-prepend()
;;
*)
echo "Unsupported shell: $CURRENT_SHELL"
exit 1
return 1
;;
esac
return 0
}
# Function to set environment variables based on the shell
@@ -52,9 +53,10 @@ x-set-env()
;;
*)
echo "Unsupported shell: $CURRENT_SHELL"
exit 1
return 1
;;
esac
return 0
}
# Explicitly set XDG folders, if not already set
@@ -74,7 +76,7 @@ x-path-prepend "$DOTFILES/local/bin"
x-path-prepend "$XDG_BIN_HOME"
# Custom completion paths
[ -z "$ZSH_CUSTOM_COMPLETION_PATH" ] && export ZSH_CUSTOM_COMPLETION_PATH="$XDG_CONFIG_HOME/zsh/completion"
[[ -z "$ZSH_CUSTOM_COMPLETION_PATH" ]] && export ZSH_CUSTOM_COMPLETION_PATH="$XDG_CONFIG_HOME/zsh/completion"
x-dc "$ZSH_CUSTOM_COMPLETION_PATH"
export FPATH="$ZSH_CUSTOM_COMPLETION_PATH:$FPATH"
@@ -83,7 +85,8 @@ if ! declare -f msg > /dev/null; then
# $1 - message (string)
msg()
{
[ "$VERBOSE" -eq 1 ] && msgr msg "$1"
local message="$1"
[[ "$VERBOSE" -eq 1 ]] && msgr msg "$message"
return 0
}
msg "msg was not defined, defined it now"
@@ -95,7 +98,8 @@ if ! declare -f msg_err > /dev/null; then
# $1 - error message (string)
msg_err()
{
msgr err "$1" >&2
local message="$1"
msgr err "$message" >&2
exit 1
}
fi
@@ -106,7 +110,8 @@ if ! declare -f msg_done > /dev/null; then
# $1 - message (string)
msg_done()
{
msgr "done" "$1"
local message="$1"
msgr "done" "$message"
return 0
}
fi
@@ -117,7 +122,8 @@ if ! declare -f msg_run > /dev/null; then
# $1 - message (string)
msg_run()
{
msgr run "$1"
local message="$1"
msgr run "$message"
return 0
}
fi
@@ -128,7 +134,8 @@ if ! declare -f msg_ok > /dev/null; then
# $1 - message (string)
msg_ok()
{
msgr ok "$1"
local message="$1"
msgr ok "$message"
return 0
}
fi
@@ -143,12 +150,16 @@ if ! declare -f array_diff > /dev/null; then
# Source: https://stackoverflow.com/a/42399479/594940
array_diff()
{
local result_var="$1"
local arr1_name="$2"
local arr2_name="$3"
# shellcheck disable=SC1083,SC2086
eval local ARR1=\(\"\${$2[@]}\"\)
eval local ARR1=\(\"\${${arr1_name}[@]}\"\)
# shellcheck disable=SC1083,SC2086
eval local ARR2=\(\"\${$3[@]}\"\)
eval local ARR2=\(\"\${${arr2_name}[@]}\"\)
local IFS=$'\n'
mapfile -t "$1" < <(comm -23 <(echo "${ARR1[*]}" | sort) <(echo "${ARR2[*]}" | sort))
mapfile -t "$result_var" < <(comm -23 <(echo "${ARR1[*]}" | sort) <(echo "${ARR2[*]}" | sort))
return 0
}
fi

View File

@@ -7,13 +7,13 @@ DEFAULT_NAME="main"
CURRENT_SESSION=$(tmux display-message -p "#{session_name}")
# Check that the session has a name
if [ "$CURRENT_SESSION" = "#{session_name}" ] || [ "$CURRENT_SESSION" = "0" ]; then
if [[ "$CURRENT_SESSION" = "#{session_name}" ]] || [[ "$CURRENT_SESSION" = "0" ]]; then
# Check if the default name is already in use
if tmux has-session -t "$DEFAULT_NAME" 2> /dev/null; then
# Query the user for a new name
echo "Session name '$DEFAULT_NAME' is already in use. Enter a new name:"
read -r NEW_NAME
while tmux has-session -t "$NEW_NAME" 2> /dev/null || [ -z "$NEW_NAME" ]; do
while tmux has-session -t "$NEW_NAME" 2> /dev/null || [[ -z "$NEW_NAME" ]]; do
echo "Name '$NEW_NAME' is invalid or already in use. Enter a new name:"
read -r NEW_NAME
done

View File

@@ -8,12 +8,14 @@
set -euo pipefail
# Fall back to native tmux session picker if sesh is not installed
if ! command -v sesh &>/dev/null; then
if ! command -v sesh &> /dev/null; then
tmux choose-tree -Zs
exit 0
fi
pick_with_gum() {
# Pick a sesh session using gum filter
pick_with_gum()
{
sesh list -i \
| gum filter \
--limit 1 \
@@ -22,6 +24,7 @@ pick_with_gum() {
--placeholder 'Pick a sesh' \
--height 50 \
--prompt='⚡'
return 0
}
FZF_COMMON_OPTS=(
@@ -40,15 +43,23 @@ FZF_COMMON_OPTS=(
--preview 'sesh preview {}'
)
pick_with_fzf_tmux() {
# Pick a sesh session using fzf-tmux popup
pick_with_fzf_tmux()
{
sesh list --icons | fzf-tmux -p 80%,70% "${FZF_COMMON_OPTS[@]}"
return 0
}
pick_with_fzf() {
# Pick a sesh session using fzf inline
pick_with_fzf()
{
sesh list --icons | fzf "${FZF_COMMON_OPTS[@]}"
return 0
}
pick_with_select() {
# Pick a sesh session using bash select menu
pick_with_select()
{
local sessions
mapfile -t sessions < <(sesh list)
if [[ ${#sessions[@]} -eq 0 ]]; then
@@ -64,11 +75,11 @@ pick_with_select() {
}
# Cascading tool detection
if command -v gum &>/dev/null; then
if command -v gum &> /dev/null; then
selection=$(pick_with_gum)
elif command -v fzf-tmux &>/dev/null; then
elif command -v fzf-tmux &> /dev/null; then
selection=$(pick_with_fzf_tmux)
elif command -v fzf &>/dev/null; then
elif command -v fzf &> /dev/null; then
selection=$(pick_with_fzf)
else
selection=$(pick_with_select)

View File

@@ -0,0 +1,40 @@
# Skip Already-Installed Cargo Packages
## Problem
`install-cargo-packages.sh` runs `cargo install-update -a` to update installed
packages, then runs `cargo install` for every package in the list — including
ones that are already installed and up-to-date. This wastes time rebuilding
packages that don't need it.
## Solution
Capture the `cargo install-update -a` output, parse installed package names,
and skip `cargo install` for any package that appeared in the update output.
## Changes
**File:** `scripts/install-cargo-packages.sh`
1. Declare an associative array `installed_packages` at the top.
2. In the `cargo-install-update` section, capture output with `tee /dev/stderr`
so it displays in real-time while also being stored in a variable.
3. Parse the captured output with `awk` — extract the first column from lines
matching a version pattern (`v[0-9]+\.[0-9]+`), skipping the header.
4. Populate `installed_packages` associative array from parsed names.
5. In `install_packages()`, check each package against the array. If found, log
a skip message via `msgr` and continue. If not found, install as before.
6. If `cargo-install-update` is not available, the array stays empty and all
packages install normally (preserves existing behavior).
## Output Parsing
The `cargo install-update -a` output format:
```text
Package Installed Latest Needs update
zoxide v0.9.8 v0.9.9 Yes
bkt v0.8.2 v0.8.2 No
```
Extraction: `awk '/v[0-9]+\.[0-9]+/ { print $1 }'` gets package names.
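The capture-and-skip flow above could be sketched as follows. This is a minimal sketch, assuming the design's names (`installed_packages`, the `awk` pattern); `parse_installed` and the sample text are illustrative, not repo code:

```shell
#!/usr/bin/env bash
# Sketch of the parsing step: first column of lines containing a
# version like v0.9.8. The header row has no version, so it is
# skipped automatically without special-casing.
declare -A installed_packages

parse_installed() {
  awk '/v[0-9]+\.[0-9]+/ { print $1 }'
}

# Sample output in the format shown in the design doc.
sample='Package  Installed  Latest  Needs update
zoxide   v0.9.8     v0.9.9  Yes
bkt      v0.8.2     v0.8.2  No'

while read -r name; do
  installed_packages["$name"]=1
done < <(parse_installed <<< "$sample")
```

In the real script the sample would instead come from `cargo install-update -a 2>/dev/null | tee /dev/stderr`, and `install_packages()` would `continue` past any package present in the array.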

View File

@@ -0,0 +1,55 @@
# dfm Cleanup Design
## Summary
Clean up `local/bin/dfm` to fix bugs, remove dead code, improve
cross-platform portability, and make error handling consistent.
## Changes
### 1. Bash Version Bootstrap
Add a check at the top of the script (after variable declarations)
that requires bash 4.0+. On macOS, if bash is too old, install
Homebrew (if missing) and bash, then print instructions and exit.
The check itself uses only bash 3.2-compatible syntax.
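A bootstrap check along those lines might look like this. The messages, helper name, and exact flow are assumptions; only the constraint from the design (bash 3.2-compatible syntax, require 4.0+) is taken as given:

```shell
#!/usr/bin/env bash
# Sketch of the version bootstrap; wording is an assumption.
# Uses only bash 3.2-compatible syntax so it runs on stock macOS bash.

# Returns 0 when the given major version meets the 4.0+ requirement.
bash_version_ok() {
  [ "${1:-0}" -ge 4 ]
}

if ! bash_version_ok "${BASH_VERSINFO[0]}"; then
  if [ "$(uname -s)" = "Darwin" ]; then
    echo "dfm requires bash 4.0+; try: brew install bash" >&2
  else
    echo "dfm requires bash 4.0+ (found ${BASH_VERSION})" >&2
  fi
  exit 1
fi
```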
### 2. Remove Fish Dead Code
Remove `CURRENT_SHELL` detection, `source_file()` function, and all
fish branches. Replace `source_file` calls with direct `source`.
The script has a bash shebang — fish handling was unreachable.
### 3. Bug Fixes
- Remove `ntfy` from install menu (no install script exists)
- Fix `msg)` → `msgr)` case label in `section_tests`
- Guard all `shift` calls against empty argument lists
- Quote `$width` in `menu_builder` seq calls
- Fix `$"..."` locale string → `"..."` in `usage()`
- Fix `exit 0` on apt.txt error → `return 1`
### 4. Replace `declare -A` in `section_scripts`
Replace associative array with indexed `"name:desc"` array,
matching the pattern used everywhere else in the script.
Move `get_script_description()` to top-level (out of the function).
### 5. Early-Return Guards & exit → return
- `section_brew()`: Early return with `msgr warn` if brew unavailable.
Remove duplicate `! x-have brew` check.
- `section_apt()`: Same pattern for apt.
- `section_check()`: Replace `exit` with `return`.
- `section_apt() install`: Replace `exit` with `return`.
- `section_brew() untracked`: Replace `exit` with `return`.
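The guard pattern in section 5 could be sketched as below. `x-have` and `msgr` are the repo's helpers named elsewhere in this compare; the function body here is an illustrative assumption:

```shell
#!/usr/bin/env bash
# Sketch of the early-return guard from section 5 (assumed body).
section_brew() {
  # Warn and return instead of exiting when brew is unavailable.
  if ! x-have brew; then
    msgr warn "brew not available, skipping"
    return 1
  fi
  # ... rest of the section runs only when brew exists.
  return 0
}
```

Using `return` instead of `exit` matters because these sections run inside the interactive `dfm` menu: an `exit` would kill the whole menu rather than just abort the one section.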
## Files Changed
- `local/bin/dfm` (all changes)
## Verification
- `yarn test` (existing bats test)
- `shellcheck local/bin/dfm`
- `bash -n local/bin/dfm` (syntax check)

View File

@@ -0,0 +1,46 @@
# x-* Scripts Cleanup Design
## Summary
Comprehensive cleanup of all 34 x-* utility scripts in `local/bin/`.
Fix critical bugs, consolidate duplicates, standardize patterns.
## Changes
### Removals
- `x-mkd`, `x-mkd.md`, `tests/x-mkd.bats` — unused, cd-in-subshell broken
- `x-validate-sha256sum.sh`, `x-validate-sha256sum.sh.md` — duplicates x-sha256sum-matcher
### Thin Wrappers (delegate to x-path)
- `x-path-append` → calls `x-path append "$@"`
- `x-path-prepend` → calls `x-path prepend "$@"`
- `x-path-remove` → calls `x-path remove "$@"`
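Each wrapper reduces to a one-line delegation. A sketch (the real script would simply be `exec x-path prepend "$@"`; the function form here is for illustration only):

```shell
#!/usr/bin/env bash
# Sketch of a thin wrapper: x-path-prepend delegating to x-path.
# Assumes the consolidated x-path script is on PATH.
x_path_prepend() {
  x-path prepend "$@"
}
```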
### Critical Fixes
- `x-clean-vendordirs`: call msgr as command (it's in PATH)
- `x-foreach`: replace eval with direct "$@" execution
- `x-ip`: add error handling, curl check
### Consistency Fixes
- Fix `#!/bin/bash` → `#!/usr/bin/env bash` (x-env-list, x-localip)
- POSIX scripts keep `#!/bin/sh`
- Add `set -euo pipefail` where missing in bash scripts
- Use XDG variables instead of hardcoded paths (x-change-alacritty-theme)
- Quote unquoted variables
### Minor Fixes
- `x-multi-ping`: remove unused VERBOSE variable
- `x-when-down`, `x-when-up`: add error handling
- `x-term-colors`: add usage message
- `x-record`: fix undefined notify-call reference
## Verification
- `yarn test` — ensure remaining tests pass
- `shellcheck` on modified scripts
- `bash -n` syntax check on all modified bash scripts

View File

@@ -1,5 +1,5 @@
---
- include: 'tools/dotbot-defaults.yaml'
- include: "tools/dotbot-defaults.yaml"
- shell:
- echo "Configuring air"
- link:
@@ -7,7 +7,7 @@
force: true
glob: true
path: hosts/air/base/**
prefix: '.'
prefix: "."
~/.config/:
glob: true
force: true

View File

@@ -1,5 +1,5 @@
---
- include: 'tools/dotbot-defaults.yaml'
- include: "tools/dotbot-defaults.yaml"
- shell:
- echo "Configuring lakka"
- link:
@@ -7,7 +7,7 @@
force: true
glob: true
path: hosts/lakka/base/**
prefix: '.'
prefix: "."
~/.config/:
glob: true
force: true

View File

@@ -1,5 +1,5 @@
---
- include: 'tools/dotbot-defaults.yaml'
- include: "tools/dotbot-defaults.yaml"
- shell:
- echo "Configuring s"
- link:
@@ -7,7 +7,7 @@
force: true
glob: true
path: hosts/s/base/**
prefix: '.'
prefix: "."
~/.config/:
glob: true
force: true

View File

@@ -1,5 +1,5 @@
---
- include: 'tools/dotbot-defaults.yaml'
- include: "tools/dotbot-defaults.yaml"
- shell:
- echo "Configuring tunkki"
- link:
@@ -7,7 +7,7 @@
force: true
glob: true
path: hosts/tunkki/base/**
prefix: '.'
prefix: "."
~/.config/:
glob: true
force: true

install
View File

@@ -15,24 +15,18 @@ git submodule update --init --recursive "${DOTBOT_DIR}"
"${DOTBOT_BIN_PATH}" \
-d "${BASEDIR}" \
--plugin-dir=tools/dotbot-asdf \
--plugin-dir=tools/dotbot-brew \
--plugin-dir=tools/dotbot-include \
--plugin-dir=tools/dotbot-pip \
-c "${CONFIG}" \
"${@}"
if [ "${DOTBOT_HOST}" != "" ]; then
DOTBOT_HOST_CONFIG="${BASEDIR}/hosts/${DOTBOT_HOST}/${CONFIG}"
echo "-> Trying if host config can be found: ${DOTBOT_HOST_CONFIG}"
[ -r "$DOTBOT_HOST_CONFIG" ] && [ -f "$DOTBOT_HOST_CONFIG" ] &&
echo "(!) Found $DOTBOT_HOST_CONFIG" &&
"$DOTBOT_BIN_PATH" \
[ -r "$DOTBOT_HOST_CONFIG" ] && [ -f "$DOTBOT_HOST_CONFIG" ] \
&& echo "(!) Found $DOTBOT_HOST_CONFIG" \
&& "$DOTBOT_BIN_PATH" \
-d "$BASEDIR" \
--plugin-dir=tools/dotbot-asdf \
--plugin-dir=tools/dotbot-brew \
--plugin-dir=tools/dotbot-include \
--plugin-dir=tools/dotbot-pip \
-c "$DOTBOT_HOST_CONFIG" \
"${@}"
fi

View File

@@ -1,5 +1,5 @@
---
- include: 'tools/dotbot-defaults.yaml'
- include: "tools/dotbot-defaults.yaml"
- clean:
~/:
@@ -34,7 +34,7 @@
force: true
glob: true
path: base/*
prefix: '.'
prefix: "."
# Most of the configs
~/.config/:
glob: true
@@ -78,8 +78,3 @@
- shell:
# Use my dotfiles manager to install everything
- bash local/bin/dfm install all
- pipx:
file: tools/requirements-pipx.txt
stdout: true
stderr: true

View File

@@ -20,7 +20,7 @@ Some problematic code has been fixed per `shellcheck` suggestions.
## Sourced
| Script | Source |
| ----------------------- | ----------------- |
|-------------------------|-------------------|
| `x-dupes` | skx/sysadmin-util |
| `x-foreach` | mvdan/dotfiles |
| `x-multi-ping` | skx/sysadmin-util |

View File

@@ -1,7 +1,9 @@
#!/usr/bin/env bash
# A script for encrypting and decrypting files or directories with age and SSH keys
VERSION="1.0.0"
set -euo pipefail
VERSION="1.1.0"
# Default ENV values
KEYS_FILE="${AGE_KEYSFILE:-$HOME/.ssh/keys.txt}"
@@ -9,14 +11,49 @@ KEYS_SOURCE="${AGE_KEYSSOURCE:-https://github.com/ivuorinen.keys}"
LOG_FILE="${AGE_LOGFILE:-$HOME/.cache/a.log}"
VERBOSE=false
DELETE_ORIGINAL=false
FORCE=false
# Parse flags for verbosity
for arg in "$@"; do
if [[ "$arg" == "-v" || "$arg" == "--verbose" ]]; then
VERBOSE=true
break
# Check for required dependencies
check_dependencies()
{
if ! command -v age &> /dev/null; then
echo "Error: 'age' is not installed. Please install it first." >&2
echo " brew install age # macOS" >&2
echo " apt install age # Debian/Ubuntu" >&2
echo " dnf install age # Fedora" >&2
exit 1
fi
done
if ! command -v curl &> /dev/null; then
echo "Error: 'curl' is not installed." >&2
exit 1
fi
}
# Parse flags
parse_flags()
{
local args=()
for arg in "$@"; do
case "$arg" in
-v | --verbose)
VERBOSE=true
;;
--delete)
DELETE_ORIGINAL=true
;;
-f | --force)
FORCE=true
;;
*)
args+=("$arg")
;;
esac
done
# Return remaining arguments
printf '%s\n' "${args[@]}"
}
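The parser above returns the remaining positional arguments on stdout, one per line, and `main` collects them with `mapfile`. A minimal sketch of that hand-off (names are illustrative, not from the script):

```shell
#!/usr/bin/env bash
set -euo pipefail
VERBOSE=false

# Print non-flag arguments one per line; flags only mutate globals.
parse_flags() {
  local args=()
  for arg in "$@"; do
    case "$arg" in
      -v | --verbose) VERBOSE=true ;;
      *) args+=("$arg") ;;
    esac
  done
  printf '%s\n' "${args[@]}"
}

# Collect the remaining arguments. Caveat: the process substitution
# runs parse_flags in a subshell, so globals it sets (VERBOSE here)
# do not propagate back to the caller.
mapfile -t ARGS < <(parse_flags -v encrypt file.txt)
echo "${ARGS[*]}"
```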
# Ensure log directory and file exist with correct permissions
prepare_log_file()
@@ -38,8 +75,6 @@ prepare_log_file()
chmod 0600 "$LOG_FILE"
}
prepare_log_file
# Logging function
log_message()
{
@@ -56,7 +91,7 @@ log_message()
print_help()
{
cat << EOF
Usage: a [command] [file_or_directory] [options]
Usage: a [options] [command] [file_or_directory]
Commands:
e, enc, encrypt Encrypt the specified file or directory
@@ -65,12 +100,14 @@ Commands:
version, --version Show version information
Options:
-v, --verbose Print log messages to console in addition to writing to log file
-v, --verbose Print log messages to console
--delete Delete original files after successful encryption
-f, --force Overwrite existing output files without prompting
Environment Variables:
AGE_KEYSFILE Path to the SSH keys file (default: $HOME/.ssh/keys.txt)
AGE_KEYSFILE Path to the SSH keys file (default: \$HOME/.ssh/keys.txt)
AGE_KEYSSOURCE URL to fetch SSH keys if keys file does not exist
AGE_LOGFILE Path to the log file (default: $HOME/.cache/a.log)
AGE_LOGFILE Path to the log file (default: \$HOME/.cache/a.log)
Examples:
Encrypt a file:
@@ -79,14 +116,21 @@ Examples:
Encrypt a directory:
a e /path/to/directory
Encrypt and delete originals:
a --delete e file.txt
Decrypt a file:
a d file.txt.age
Force overwrite existing files:
a -f e file.txt
Specify a custom keys file:
AGE_KEYSFILE=/path/to/keys.txt a e file.txt
Specify a custom keys source and log file:
AGE_KEYSSOURCE=https://example.com/keys.txt AGE_LOGFILE=/tmp/a.log a d file.txt.age
Requirements:
- age (encryption tool): https://github.com/FiloSottile/age
- curl (for fetching keys)
EOF
}
@@ -115,26 +159,104 @@ fetch_keys_if_missing()
fi
}
# Function to encrypt a single file
encrypt_single_file()
{
local file="$1"
# Skip already encrypted files
if [[ "$file" == *.age ]]; then
log_message "Skipping already encrypted file: $file"
return 0
fi
local output_file="${file}.age"
# Check if output file exists
if [[ -f "$output_file" && "$FORCE" != true ]]; then
log_message "Error: Output file '$output_file' already exists. Use --force to overwrite."
return 1
fi
fetch_keys_if_missing
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -R "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File encrypted successfully: $output_file"
if [[ "$DELETE_ORIGINAL" == true ]]; then
rm -f "$file"
log_message "Original file deleted: $file"
fi
else
rm -f "$temp_file"
log_message "Error: Failed to encrypt file '$file'."
return 1
fi
}
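The encrypt path writes through a temp file in the same directory and renames on success, so a failed run never leaves a truncated `.age` file behind. A runnable sketch of the same pattern, with `tr` standing in for `age`:

```shell
#!/usr/bin/env bash
set -euo pipefail

src=$(mktemp)
printf 'hello' > "$src"
out="${src}.enc"

# Write to a temp file in the destination directory, then mv into
# place; rename within one filesystem is atomic, so readers never
# observe a partially written output file.
tmp=$(mktemp -p "$(dirname "$src")")
if tr 'a-z' 'A-Z' < "$src" > "$tmp" && mv "$tmp" "$out"; then
  result=$(cat "$out")
else
  rm -f "$tmp"
  result="failed"
fi
echo "$result"
```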
# Function to encrypt files or directories
encrypt_file_or_directory()
{
local file="$1"
if [[ -d "$file" ]]; then
for f in "$file"/*; do
# Enable dotglob to include hidden files
shopt -s dotglob nullglob
local files=("$file"/*)
shopt -u dotglob nullglob
if [[ ${#files[@]} -eq 0 ]]; then
log_message "Warning: Directory '$file' is empty."
return 0
fi
for f in "${files[@]}"; do
encrypt_file_or_directory "$f"
done
elif [[ -f "$file" ]]; then
fetch_keys_if_missing
local output_file="${file}.age"
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -R "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File encrypted successfully: $output_file"
else
rm -f "$temp_file"
log_message "Error: Failed to encrypt file '$file'."
exit 1
encrypt_single_file "$file"
else
log_message "Warning: '$file' is not a file or directory, skipping."
fi
}
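The directory branch enables `dotglob` so hidden files are included and `nullglob` so an empty directory expands to an empty array instead of a literal `*`. A sketch of that capture-then-restore idiom:

```shell
#!/usr/bin/env bash
set -euo pipefail

d=$(mktemp -d)
touch "$d/a.txt" "$d/.hidden"

# Expand once into an array with both options on, then restore them.
shopt -s dotglob nullglob
files=("$d"/*)
shopt -u dotglob nullglob

count=${#files[@]}
echo "$count"
```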
# Function to decrypt a single file
decrypt_single_file()
{
local file="$1"
if [[ ! "$file" == *.age ]]; then
log_message "Skipping non-.age file: $file"
return 0
fi
local output_file="${file%.age}"
# Check if output file exists
if [[ -f "$output_file" && "$FORCE" != true ]]; then
log_message "Error: Output file '$output_file' already exists. Use --force to overwrite."
return 1
fi
fetch_keys_if_missing
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -d -i "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File decrypted successfully: $output_file"
if [[ "$DELETE_ORIGINAL" == true ]]; then
rm -f "$file"
log_message "Encrypted file deleted: $file"
fi
else
rm -f "$temp_file"
log_message "Error: Failed to decrypt file '$file'."
return 1
fi
}
@@ -142,54 +264,76 @@ encrypt_file_or_directory()
decrypt_file_or_directory()
{
local file="$1"
if [[ -d "$file" ]]; then
for f in "$file"/*.age; do
decrypt_file_or_directory "$f"
# Enable nullglob to handle no matches gracefully
shopt -s nullglob
local files=("$file"/*.age)
shopt -u nullglob
if [[ ${#files[@]} -eq 0 ]]; then
log_message "Warning: No .age files found in directory '$file'."
return 0
fi
for f in "${files[@]}"; do
decrypt_single_file "$f"
done
elif [[ -f "$file" ]]; then
fetch_keys_if_missing
local output_file="${file%.age}"
local temp_file
temp_file="$(mktemp -p "$(dirname "$file")")"
if age -d -i "$KEYS_FILE" "$file" > "$temp_file" && mv "$temp_file" "$output_file"; then
log_message "File decrypted successfully: $output_file"
else
rm -f "$temp_file"
log_message "Error: Failed to decrypt file '$file'."
exit 1
fi
decrypt_single_file "$file"
else
log_message "Warning: '$file' is not a file or directory, skipping."
fi
}
# Main logic
case "$1" in
e | enc | encrypt)
if [[ $# -lt 2 ]]; then
log_message "Error: No file or directory specified for encryption."
# Main entry point
main()
{
check_dependencies
# Parse flags and get remaining arguments
mapfile -t ARGS < <(parse_flags "$@")
prepare_log_file
local command="${ARGS[0]:-}"
local target="${ARGS[1]:-}"
case "$command" in
e | enc | encrypt)
if [[ -z "$target" ]]; then
log_message "Error: No file or directory specified for encryption."
print_help
exit 1
fi
encrypt_file_or_directory "$target"
;;
d | dec | decrypt)
if [[ -z "$target" ]]; then
log_message "Error: No file or directory specified for decryption."
print_help
exit 1
fi
decrypt_file_or_directory "$target"
;;
help | --help | -h)
print_help
;;
version | --version)
print_version
;;
"")
print_help
exit 1
fi
encrypt_file_or_directory "$2"
;;
d | dec | decrypt)
if [[ $# -lt 2 ]]; then
log_message "Error: No file or directory specified for decryption."
;;
*)
log_message "Error: Unknown command '$command'"
print_help
exit 1
fi
decrypt_file_or_directory "$2"
;;
help | --help)
print_help
;;
version | --version)
print_version
;;
*)
log_message "Error: Unknown command '$1'"
print_help
exit 1
;;
esac
;;
esac
}
main "$@"
# vim: ft=bash:syn=sh:ts=2:sw=2:et:ai:nowrap

View File

@@ -2,28 +2,76 @@
Encrypt or decrypt files and directories using `age` and your GitHub SSH keys.
## Requirements
- [age](https://github.com/FiloSottile/age) - encryption tool
- curl - for fetching SSH keys
Install age:
```bash
brew install age # macOS
apt install age # Debian/Ubuntu
dnf install age # Fedora
```
## Usage
```bash
a encrypt <file|dir>
a decrypt <file.age|dir>
a [options] <command> <file|directory>
```
Commands:
- `e`, `enc`, `encrypt` - encrypt files
- `d`, `dec`, `decrypt` - decrypt files
- `help`, `--help`, `-h` - show help
- `version`, `--version` - show version
Options:
- `-v`, `--verbose` show log output
- `-v`, `--verbose` - show log output
- `--delete` - delete original files after successful operation
- `-f`, `--force` - overwrite existing output files
Environment variables:
- `AGE_KEYSFILE` location of the keys file
- `AGE_KEYSSOURCE` URL to fetch keys if missing
- `AGE_LOGFILE` log file path
- `AGE_KEYSFILE` - location of the keys file (default: `~/.ssh/keys.txt`)
- `AGE_KEYSSOURCE` - URL to fetch keys if missing (default: GitHub keys)
- `AGE_LOGFILE` - log file path (default: `~/.cache/a.log`)
## Example
## Examples
```bash
# Encrypt a file
a encrypt secret.txt
# Encrypt with short command
a e secret.txt
# Decrypt a file
a decrypt secret.txt.age
a d secret.txt.age
# Encrypt a directory (includes hidden files)
a e /path/to/secrets/
# Encrypt and delete originals
a --delete e secret.txt
# Force overwrite existing .age file
a -f e secret.txt
# Verbose output
a -v e secret.txt
```
## Behavior
- Encrypting a directory processes all files recursively, including hidden files
- Already encrypted files (`.age`) are skipped during encryption
- Only `.age` files are processed during directory decryption
- Original files are preserved by default (use `--delete` to remove them)
- Output files are not overwritten by default (use `--force` to overwrite)
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->

View File

@@ -15,38 +15,37 @@
SCRIPT=$(basename "$0")
# Detect the current shell
CURRENT_SHELL=$(ps -p $$ -ocomm= | awk -F/ '{print $NF}')
# Require bash 4.0+ for associative arrays and mapfile
if ((BASH_VERSINFO[0] < 4)); then
echo "dfm requires bash 4.0+, found ${BASH_VERSION}"
if [[ "$(uname)" == "Darwin" ]]; then
if ! command -v brew &> /dev/null; then
echo "Installing Homebrew..."
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
fi
echo "Installing modern bash via Homebrew..."
brew install bash
echo "Done. Restart your shell and run dfm again."
else
echo "Install bash 4.0+ and try again."
fi
exit 1
fi
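The gate above relies on `BASH_VERSINFO[0]` holding the running shell's major version as an integer, so a plain arithmetic test suffices. A minimal check followed by one of the 4.0+ features it protects (associative arrays):

```shell
#!/usr/bin/env bash
# BASH_VERSINFO[0] is the major version of the bash executing the
# script, independent of whatever /bin/sh is on the system.
if ((BASH_VERSINFO[0] < 4)); then
  echo "requires bash 4.0+, found ${BASH_VERSION}" >&2
  exit 1
fi

# Associative arrays and mapfile are the 4.0+ features being guarded.
declare -A status=([bash]=ok)
echo "${status[bash]}"
```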
# Function to source files based on the shell
source_file()
# shellcheck disable=SC1091
source "$DOTFILES/config/shared.sh"
# shellcheck disable=SC1090
source "${DOTFILES}/local/bin/msgr"
# Get description from a script file's @description tag
get_script_description()
{
local file=$1
case "$CURRENT_SHELL" in
fish)
if [[ -f "$file.fish" ]]; then
# shellcheck disable=SC1090
source "$file.fish"
else
echo "Fish shell file not found: $file.fish"
exit 1
fi
;;
sh | bash | zsh)
# shellcheck disable=SC1090
source "$file"
;;
*)
echo "Unsupported shell: $CURRENT_SHELL"
exit 1
;;
esac
local file="$1"
local desc
desc=$(sed -n '/@description/s/.*@description *\(.*\)/\1/p' "$file" | head -1)
echo "${desc:-No description available}"
}
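The `sed` expression prints only the text after the first `@description` tag, `head -1` keeps a single line, and the `${desc:-...}` expansion supplies a fallback for files without a tag. A self-contained sketch against a hypothetical script file:

```shell
#!/usr/bin/env bash
set -euo pipefail

f=$(mktemp)
printf '#!/usr/bin/env bash\n# @description Install example tool\n' > "$f"

# -n suppresses default output; the s/// keeps only the text that
# follows the tag, and p prints matching lines.
desc=$(sed -n '/@description/s/.*@description *\(.*\)/\1/p' "$f" | head -1)
echo "${desc:-No description available}"
```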
# Modify the source commands to use the new function
source_file "$DOTFILES/config/shared.sh"
source_file "${DOTFILES}/local/bin/msgr"
# Menu builder
menu_builder()
{
@@ -54,9 +53,9 @@ menu_builder()
local commands=("${@:2}")
local width=60
printf "\n%s\n" "$(printf '%.s─' $(seq 1 $width))"
printf "\n%s\n" "$(printf '%.s─' $(seq 1 "$width"))"
printf "%-${width}s\n" " $title"
printf "%s\n" "$(printf '%.s─' $(seq 1 $width))"
printf "%s\n" "$(printf '%.s─' $(seq 1 "$width"))"
for cmd in "${commands[@]}"; do
local name=${cmd%%:*}
@@ -65,6 +64,7 @@ menu_builder()
done
}
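`menu_builder` draws its separators with a `printf` trick: `%.s` consumes each argument while printing nothing from it, so the literal before it repeats once per argument supplied by `seq`. A sketch with plain dashes:

```shell
#!/usr/bin/env bash
set -euo pipefail

width=10
# %.s formats each argument as a zero-length string, so the literal
# '-' before it is emitted once per argument from seq.
# shellcheck disable=SC2046
line=$(printf '%.s-' $(seq 1 "$width"))
echo "$line"
```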
# Handle install section commands
section_install()
{
USAGE_PREFIX="$SCRIPT install <command>"
@@ -80,7 +80,6 @@ section_install()
"imagick:Install ImageMagick CLI"
"macos:Setup nice macOS defaults"
"npm-packages:Install NPM Packages"
"ntfy:Install ntfy"
"nvm-latest:Install latest lts node using nvm"
"nvm:Install Node Version Manager (nvm)"
"z:Install z"
@@ -100,6 +99,7 @@ section_install()
$0 install npm-packages
$0 install z
msgr msg "Reloading configurations again..."
# shellcheck disable=SC1091
source "$DOTFILES/config/shared.sh"
msgr yay "All done!"
;;
@@ -112,7 +112,7 @@ section_install()
cheat-databases)
msgr run "Installing cheat databases..."
for database in "$DOTFILES"/scripts/install-cheat-*; do
for database in "$DOTFILES"/scripts/install-cheat-*.sh; do
bash "$database" \
&& msgr run_done "Cheat: $database run"
done
@@ -194,6 +194,7 @@ section_install()
esac
}
# Handle Homebrew section commands
section_brew()
{
USAGE_PREFIX="$SCRIPT brew <command>"
@@ -208,89 +209,91 @@ section_brew()
"untracked:List untracked brew packages"
)
x-have brew && {
case "$1" in
install)
brew bundle install --file="$BREWFILE" --force --quiet && msgr yay "Done!"
;;
if ! x-have brew; then
msgr warn "brew not available, skipping"
return 0
fi
update)
brew update && brew outdated && brew upgrade && brew cleanup
msgr yay "Done!"
;;
case "$1" in
install)
brew bundle install --file="$BREWFILE" --force --quiet && msgr yay "Done!"
;;
updatebundle)
# Updates .dotfiles/homebrew/Brewfile with descriptions
brew bundle dump \
--force \
--file="$BREWFILE" \
--cleanup \
--tap \
--formula \
--cask \
--describe && msgr yay "Done!"
;;
update)
brew update && brew outdated && brew upgrade && brew cleanup
msgr yay "Done!"
;;
leaves)
brew leaves --installed-on-request
;;
updatebundle)
# Updates .dotfiles/homebrew/Brewfile with descriptions
brew bundle dump \
--force \
--file="$BREWFILE" \
--cleanup \
--tap \
--formula \
--cask \
--describe && msgr yay "Done!"
;;
untracked)
declare -a BREW_LIST_ALL
while IFS= read -r line; do
BREW_LIST_ALL+=("$line")
done < <(brew list --formula --installed-on-request -1 --full-name)
while IFS= read -r c; do
BREW_LIST_ALL+=("$c")
done < <(brew list --cask -1 --full-name)
leaves)
brew leaves --installed-on-request
;;
# Remove entries that are installed as dependencies
declare -a BREW_LIST_DEPENDENCIES
while IFS= read -r l; do
BREW_LIST_DEPENDENCIES+=("$l")
done < <(brew list -1 --installed-as-dependency)
untracked)
declare -a BREW_LIST_ALL
while IFS= read -r line; do
BREW_LIST_ALL+=("$line")
done < <(brew list --formula --installed-on-request -1 --full-name)
while IFS= read -r c; do
BREW_LIST_ALL+=("$c")
done < <(brew list --cask -1 --full-name)
declare -a BREW_LIST_BUNDLED
while IFS= read -r b; do
BREW_LIST_BUNDLED+=("$b")
done < <(brew bundle list --all --file="$BREWFILE")
# Remove entries that are installed as dependencies
declare -a BREW_LIST_DEPENDENCIES
while IFS= read -r l; do
BREW_LIST_DEPENDENCIES+=("$l")
done < <(brew list -1 --installed-as-dependency)
declare -a BREW_LIST_TRACKED_WITHOUT_DEPS
for f in "${BREW_LIST_ALL[@]}"; do
# shellcheck disable=SC2199
if [[ " ${BREW_LIST_DEPENDENCIES[@]} " != *" ${f} "* ]]; then
BREW_LIST_TRACKED_WITHOUT_DEPS+=("$f")
fi
done
declare -a BREW_LIST_BUNDLED
while IFS= read -r b; do
BREW_LIST_BUNDLED+=("$b")
done < <(brew bundle list --all --file="$BREWFILE")
array_diff BREW_LIST_UNTRACKED BREW_LIST_TRACKED_WITHOUT_DEPS BREW_LIST_BUNDLED
# If there are no untracked packages, exit
if [ ${#BREW_LIST_UNTRACKED[@]} -eq 0 ]; then
msgr yay "No untracked packages found!"
exit 0
declare -a BREW_LIST_TRACKED_WITHOUT_DEPS
for f in "${BREW_LIST_ALL[@]}"; do
# shellcheck disable=SC2199
if [[ " ${BREW_LIST_DEPENDENCIES[@]} " != *" ${f} "* ]]; then
BREW_LIST_TRACKED_WITHOUT_DEPS+=("$f")
fi
done
echo "Untracked:"
for f in "${BREW_LIST_UNTRACKED[@]}"; do
echo " $f"
done
;;
array_diff BREW_LIST_UNTRACKED BREW_LIST_TRACKED_WITHOUT_DEPS BREW_LIST_BUNDLED
autoupdate)
brew autoupdate delete
brew autoupdate start 43200 --upgrade --cleanup --immediate
;;
# If there are no untracked packages, return
if [ ${#BREW_LIST_UNTRACKED[@]} -eq 0 ]; then
msgr yay "No untracked packages found!"
return 0
fi
clean) brew bundle cleanup --file="$BREWFILE" && msgr yay "Done!" ;;
echo "Untracked:"
for f in "${BREW_LIST_UNTRACKED[@]}"; do
echo " $f"
done
;;
*) menu_builder "$USAGE_PREFIX" "${MENU[@]}" ;;
esac
}
autoupdate)
brew autoupdate delete
brew autoupdate start 43200 --upgrade --cleanup --immediate
;;
! x-have brew && menu_builder "$USAGE_PREFIX" "brew not available on this system"
clean) brew bundle cleanup --file="$BREWFILE" && msgr yay "Done!" ;;
*) menu_builder "$USAGE_PREFIX" "${MENU[@]}" ;;
esac
}
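The `untracked` branch filters out dependencies with a whole-word membership test: both the space-joined array and the candidate are padded with spaces so glob matching cannot hit substrings. A sketch with hypothetical package names:

```shell
#!/usr/bin/env bash
set -euo pipefail

deps=(git git-lfs)
all=(git git-lfs jq)
keep=()

for f in "${all[@]}"; do
  # Padding with spaces makes the match whole-word: 'git' cannot
  # match inside 'git-lfs'.
  if [[ " ${deps[*]} " != *" ${f} "* ]]; then
    keep+=("$f")
  fi
done
echo "${keep[*]}"
```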
# Handle helper utility commands
section_helpers()
{
USAGE_PREFIX="$SCRIPT helpers <command>"
@@ -305,10 +308,10 @@ section_helpers()
"wezterm:Show wezterm keybindings"
)
CMD="$1"
shift
SECTION="$1"
shift
CMD="${1:-}"
[[ $# -gt 0 ]] && shift
SECTION="${1:-}"
[[ $# -gt 0 ]] && shift
case "$CMD" in
path)
@@ -367,6 +370,7 @@ section_helpers()
esac
}
# Handle apt package manager commands
section_apt()
{
USAGE_PREFIX="$SCRIPT apt <command>"
@@ -379,57 +383,63 @@ section_apt()
"clean:Clean apt cache"
)
x-have apt && {
case "$1" in
upkeep)
sudo apt update \
&& sudo apt upgrade -y \
&& sudo apt autoremove -y \
&& sudo apt clean
;;
if ! x-have apt; then
msgr warn "apt not available, skipping"
return 0
fi
install)
# if apt.txt is not found, exit
[ ! -f "$DOTFILES/tools/apt.txt" ] && msgr err "apt.txt not found" && exit 0
case "$1" in
upkeep)
sudo apt update \
&& sudo apt upgrade -y \
&& sudo apt autoremove -y \
&& sudo apt clean
;;
# Load apt.txt, remove comments (even if trailing comment) and empty lines.
#
# Ignoring "Quote this to prevent word splitting."
install)
# if apt.txt is not found, return with error
if [ ! -f "$DOTFILES/tools/apt.txt" ]; then
msgr err "apt.txt not found"
return 1
fi
# Load apt.txt, remove comments (even if trailing comment) and empty lines.
#
# Ignoring "Quote this to prevent word splitting."
# shellcheck disable=SC2046
sudo apt install \
-y $(
grep -vE '^\s*#' "$DOTFILES/tools/apt.txt" \
| sed -e 's/#.*//' \
| tr '\n' ' '
)
# If there's an apt.txt file under hosts/$hostname/apt.txt,
# run install on those lines too.
HOSTNAME=$(hostname -s)
HOST_APT="$DOTFILES/hosts/$HOSTNAME/apt.txt"
[[ -f $HOST_APT ]] && {
# shellcheck disable=SC2046
sudo apt install \
-y $(
grep -vE '^\s*#' "$DOTFILES/tools/apt.txt" \
| sed -e 's/#.*//' \
| tr '\n' ' '
)
sudo apt install -y $(
grep -vE '^\s*#' "$HOST_APT" \
| sed -e 's/#.*//' \
| tr '\n' ' '
)
}
# If there's an apt.txt file under hosts/$hostname/apt.txt,
# run install on those lines too.
HOSTNAME=$(hostname -s)
HOST_APT="$DOTFILES/hosts/$HOSTNAME/apt.txt"
[[ -f $HOST_APT ]] && {
# shellcheck disable=SC2046
sudo apt install -y $(
grep -vE '^\s*#' "$HOST_APT" \
| sed -e 's/#.*//' \
| tr '\n' ' '
)
}
# Try this for an alternative way to install packages
# xargs -a <(awk '! /^ *(#|$)/' "$packagelist") -r -- sudo apt-get install -y
;;
# Try this for an alternative way to install packages
# xargs -a <(awk '! /^ *(#|$)/' "$packagelist") -r -- sudo apt-get install -y
;;
update) sudo apt update ;;
upgrade) sudo apt upgrade -y ;;
autoremove) sudo apt autoremove -y ;;
clean) sudo apt clean ;;
*) menu_builder "$USAGE_PREFIX" "${MENU[@]}" ;;
esac
}
! x-have apt && menu_builder "$USAGE_PREFIX" "apt not available on this system"
update) sudo apt update ;;
upgrade) sudo apt upgrade -y ;;
autoremove) sudo apt autoremove -y ;;
clean) sudo apt clean ;;
*) menu_builder "$USAGE_PREFIX" "${MENU[@]}" ;;
esac
}
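The `install` branch turns `apt.txt` into an argument list by dropping comment-only lines, stripping trailing comments, and joining on spaces; the result is deliberately left unquoted so the shell word-splits it (hence the `SC2046` disable). A sketch of that pipeline against a hypothetical package file:

```shell
#!/usr/bin/env bash
set -euo pipefail

list=$(mktemp)
printf '# tools\ngit\ncurl # http client\n\njq\n' > "$list"

# Drop whole-line comments, strip trailing comments, join lines.
pkgs=$(grep -vE '^\s*#' "$list" | sed -e 's/#.*//' | tr '\n' ' ')

# Word splitting is intentional here, mirroring the SC2046 disable.
# shellcheck disable=SC2086
set -- $pkgs
count=$#
joined="$*"
echo "$count packages: $joined"
```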
# Handle documentation generation commands
section_docs()
{
USAGE_PREFIX="$SCRIPT docs <command>"
@@ -454,6 +464,7 @@ section_docs()
esac
}
# Handle dotfiles formatting and reset commands
section_dotfiles()
{
USAGE_PREFIX="$SCRIPT dotfiles <command>"
@@ -521,6 +532,7 @@ section_dotfiles()
esac
}
# Handle system check commands (arch, hostname)
section_check()
{
USAGE_PREFIX="$SCRIPT check <command>"
@@ -534,50 +546,36 @@ section_check()
case "$1" in
a | arch)
[[ $2 == "" ]] && echo "$X_ARCH" && exit 0
[[ $X_ARCH == "$2" ]] && exit 0 || exit 1
[[ $2 == "" ]] && echo "$X_ARCH" && return 0
[[ $X_ARCH == "$2" ]] && return 0 || return 1
;;
h | host | hostname)
[[ $2 == "" ]] && echo "$X_HOSTNAME" && exit 0
[[ $X_HOSTNAME == "$2" ]] && exit 0 || exit 1
[[ $2 == "" ]] && echo "$X_HOSTNAME" && return 0
[[ $X_HOSTNAME == "$2" ]] && return 0 || return 1
;;
*) menu_builder "$USAGE_PREFIX" "${MENU[@]}" ;;
esac
}
# Handle install script execution
section_scripts()
{
USAGE_PREFIX="$SCRIPT scripts <command>"
# Get description from a file
get_script_description()
{
local file
local desc
file="$1"
desc=$(sed -n '/@description/s/.*@description *\(.*\)/\1/p' "$file" | head -1)
echo "${desc:-No description available}"
}
# Collect scripts and their descriptions
declare -A SCRIPT_MENU
local menu_items=()
for script in "$DOTFILES/scripts/install-"*.sh; do
if [ -f "$script" ]; then
name=$(basename "$script" .sh | sed 's/install-//')
desc=$(get_script_description "$script")
SCRIPT_MENU[$name]="$desc"
menu_items+=("$name:$desc")
fi
done
case "$1" in
"")
# Show the menu
local menu_items=()
for name in "${!SCRIPT_MENU[@]}"; do
menu_items+=("$name:${SCRIPT_MENU[$name]}")
done
menu_builder "$USAGE_PREFIX" "${menu_items[@]}"
;;
*)
@@ -609,7 +607,7 @@ section_tests()
echo " $i"
done
;;
msg)
msgr)
# shellcheck disable=SC1010
msgr done "msgr done"
msgr done_suffix "msgr done_suffix"
@@ -629,11 +627,12 @@ section_tests()
esac
}
# Display main usage information for all sections
usage()
{
echo ""
msgr prompt "Usage: $SCRIPT <section> <command>"
echo $" Empty <command> prints <section> help."
echo " Empty <command> prints <section> help."
echo ""
section_install
echo ""
@@ -652,10 +651,11 @@ usage()
section_helpers
}
# Parse section argument and dispatch to handler
main()
{
SECTION="$1"
shift
SECTION="${1:-}"
[[ $# -gt 0 ]] && shift
# The main loop. The first keyword after $0 triggers section, or help.
case "$SECTION" in
install) section_install "$@" ;;
@@ -667,7 +667,7 @@ main()
docs) section_docs "$@" ;;
scripts) section_scripts "$@" ;;
tests) section_tests "$@" ;;
*) usage && exit 0 ;;
*) usage && return 0 ;;
esac
}

View File

@@ -22,32 +22,37 @@ if [ "$DEBUG" -eq 1 ]; then
set -x
fi
# Output functions
# Print an error message in red
msg_err()
{
echo -e "\e[31m$*\e[0m" >&2
}
# Print a success message in green
msg_success()
{
echo -e "\e[32m$*\e[0m"
}
# Print a warning message in yellow
msg_warn()
{
echo -e "\e[33m$*\e[0m" >&2
}
# Print an info message in blue
msg_info()
{
echo -e "\e[36m$*\e[0m"
}
# Print a debug message when verbose mode is on
msg_debug()
{
[[ $VERBOSE -eq 1 ]] && echo -e "\e[35m$*\e[0m"
}
# Display usage information and examples
show_help()
{
cat << EOF

View File

@@ -90,13 +90,14 @@ declare -A DIR_HAS_REPOS
# Record start time
START_TIME=$(date +%s)
# Logging functions
# Log an error message
log_error()
{
print_color "31" "ERROR:" >&2
echo " $*" >&2
}
# Log an informational message
log_info()
{
if [[ $VERBOSE -eq 1 ]]; then
@@ -105,6 +106,7 @@ log_info()
fi
}
# Log a warning message
log_warn()
{
print_color "33" "WARNING:" >&2
@@ -911,6 +913,7 @@ process_in_parallel()
echo -e "\nProcessed $total repositories in $dur (Total runtime: $runtime)"
}
# Check a directory for git status with progress tracking
check_directory_with_progress()
{
local dir

View File

@@ -23,21 +23,25 @@ CLR_RESET="\033[0m"
# │ Color functions │
# ╰──────────────────────────────────────────────────────────╯
# Wrap text in red color
function __color_red()
{
local MSG="$1"
echo -e "${CLR_RED}${MSG}${CLR_RESET}"
}
# Wrap text in yellow color
function __color_yellow()
{
local MSG="$1"
echo -e "${CLR_YELLOW}${MSG}${CLR_RESET}"
}
# Wrap text in green color
function __color_green()
{
local MSG="$1"
echo -e "${CLR_GREEN}${MSG}${CLR_RESET}"
}
# Wrap text in blue color
function __color_blue()
{
local MSG="$1"
@@ -48,36 +52,43 @@ function __color_blue()
# │ Helpers │
# ╰──────────────────────────────────────────────────────────╯
# Print blue arrow marker
function __log_marker()
{
echo -e "${CLR_BLUE}➜${CLR_RESET}"
}
# Print green checkmark marker
function __log_marker_ok()
{
echo -e "${CLR_GREEN}✔${CLR_RESET}"
}
# Print blue checkmark marker
function __log_marker_ok_blue()
{
echo -e "${CLR_BLUE}✔${CLR_RESET}"
}
# Print yellow warning marker
function __log_marker_warn()
{
echo -e "${CLR_YELLOW}⁕${CLR_RESET}"
}
# Print yellow question marker
function __log_marker_question()
{
echo -e "${CLR_YELLOW}?${CLR_RESET}"
}
# Print red error marker
function __log_marker_err()
{
echo -e "${CLR_RED}⛌${CLR_RESET}"
}
# Print indentation spacing
function __log_indent()
{
echo " "
@@ -87,71 +98,85 @@ function __log_indent()
# │ Log functions │
# ╰──────────────────────────────────────────────────────────╯
# Print a message with arrow marker
function msg()
{
echo -e "$(__log_marker) $1"
}
# Print a celebration message
function msg_yay()
{
echo -e "🎉 $1"
}
# Print a celebration message with checkmark
function msg_yay_done()
{
echo -e "🎉 $1 ...$(__log_marker_ok)"
}
# Print a message with completion checkmark
function msg_done()
{
echo -e "$(__log_marker) $1 ...$(__log_marker_ok)"
}
# Print a completion checkmark suffix
function msg_done_suffix()
{
echo -e "$(__log_marker) ...$(__log_marker_ok)"
}
# Print a prompt-style message
function msg_prompt()
{
echo -e "$(__log_marker_question) $1"
}
# Print a prompt message with checkmark
function msg_prompt_done()
{
echo -e "$(__log_marker_question) $1 ...$(__log_marker_ok)"
}
# Print an indented message
function msg_nested()
{
echo -e "$(__log_indent)$(__log_marker) $1"
}
# Print an indented message with checkmark
function msg_nested_done()
{
echo -e "$(__log_indent)$(__log_marker) $1 ...$(__log_marker_ok)"
}
# Print a running-task message in green
function msg_run()
{
echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2"
}
# Print a running-task message with checkmark
function msg_run_done()
{
echo -e "${CLR_GREEN}➜ $1${CLR_RESET} $2 ...$(__log_marker_ok)"
}
# Print an ok/success message
function msg_ok()
{
echo -e "$(__log_marker_ok) $1"
}
# Print a warning message
function msg_warn()
{
echo -e "$(__log_marker_warn) $1"
}
# Print an error message
function msg_err()
{
echo -e "$(__log_marker_err) $1"
@@ -174,6 +199,7 @@ ask()
# If this is being sourced, no need to run the next steps.
[ "$sourced" = 1 ] && return
# Run visual tests for all message types
function __tests()
{
msg "[ msg ]"
@@ -192,6 +218,7 @@ function __tests()
msg_yay_done "[ yay_done ]"
}
# Show usage information and examples
function usage()
{
echo "usage: msgr [type] [message] [optional second message]"

View File

@@ -19,7 +19,7 @@ set -euo pipefail # Add error handling
LATEST_VERSION_FORMULA="php" # The formula name for latest PHP version
PHP_VERSION_FILE=".php-version" # File name to look for when auto-switching
# Switch brew php version
# Verify that Homebrew is installed
function check_dependencies()
{
if ! command -v brew > /dev/null 2>&1; then
@@ -28,6 +28,7 @@ function check_dependencies()
fi
}
# Display help message and usage examples
function usage()
{
echo "Brew PHP Switcher - Switch between PHP versions installed via Homebrew"
@@ -53,6 +54,7 @@ function usage()
exit 0
}
# List all PHP versions installed via Homebrew
function list_php_versions()
{
# Check Homebrew's installation path for PHP versions
@@ -185,6 +187,7 @@ function list_php_versions()
done
}
# Convert a version number to a Homebrew formula name
function get_php_formula_for_version()
{
local version="$1"
@@ -199,6 +202,7 @@ function get_php_formula_for_version()
echo "php@$version"
}
# Check if a Homebrew formula is installed
function check_formula_installed()
{
local formula="$1"
@@ -216,6 +220,7 @@ function check_formula_installed()
return 1
}
# Unlink the currently active PHP version
function unlink_current_php()
{
local current_formula=""
@@ -241,6 +246,7 @@ function unlink_current_php()
fi
}
# Link a specific PHP formula as the active version
function link_php_version()
{
local formula="$1"
@@ -265,6 +271,7 @@ function link_php_version()
fi
}
# Display the currently active PHP version
function get_current_version()
{
if ! command -v php > /dev/null 2>&1; then
@@ -300,6 +307,7 @@ function get_current_version()
fi
}
# Validate PHP version format (x.y or latest)
function validate_version()
{
local version="$1"
@@ -312,6 +320,7 @@ function validate_version()
fi
}
# Search for .php-version file in directory hierarchy
function find_php_version_file()
{
local dir="$PWD"
@@ -334,6 +343,7 @@ function find_php_version_file()
return 1
}
# Auto-switch PHP based on .php-version file
function auto_switch_php_version()
{
local version_file
@@ -360,6 +370,7 @@ function auto_switch_php_version()
switch_php_version "$version"
}
# Switch to a specific PHP version
function switch_php_version()
{
local version="$1"
@@ -398,6 +409,7 @@ function switch_php_version()
echo "PHP executable: $(command -v php)"
}
# Parse arguments and dispatch to appropriate action
function main()
{
local version=""

View File

@@ -5,6 +5,7 @@
#
# Modified by Ismo Vuorinen <https://github.com/ivuorinen> 2023
# Display usage information for pushover
__pushover_usage()
{
printf "pushover <options> <message>\n"
@@ -23,6 +24,7 @@ __pushover_usage()
return 1
}
# Format an optional curl form field
__pushover_opt_field()
{
field=$1
@@ -33,6 +35,7 @@ __pushover_opt_field()
fi
}
# Send a pushover notification via curl
__pushover_send_message()
{
device="${1:-}"

View File

@@ -8,7 +8,7 @@ set -euo pipefail
# Enable verbosity with VERBOSE=1
VERBOSE="${VERBOSE:-0}"
A_DIR="$HOME/.config/alacritty"
A_DIR="${XDG_CONFIG_HOME:-$HOME/.config}/alacritty"
# Function to print usage information
usage()

View File

@@ -7,13 +7,15 @@
# Author: Ismo Vuorinen 2025
# License: MIT
set -euo pipefail
# Check if the user has provided a directory as an argument
if [ "$1" ]; then
if [ "${1:-}" ]; then
# Check if the directory exists
if [ -d "$1" ]; then
CLEANDIR="$1"
else
msgr err "Directory $1 does not exist."
echo "Error: Directory $1 does not exist." >&2
exit 1
fi
else
@@ -27,7 +29,7 @@ remove_node_modules_vendor()
# If the directory is a symlink, skip it
if [ -L "$dir" ]; then
msgr msg "Skipping symlink $dir"
echo "Skipping symlink $dir"
return
fi
@@ -35,18 +37,18 @@ remove_node_modules_vendor()
if [ -d "$dir" ]; then
# If node_modules or vendor folder exists, remove it and all its contents
if [ -d "$dir/node_modules" ]; then
msgr run "Removing $dir/node_modules"
echo "Removing $dir/node_modules"
rm -rf "$dir/node_modules"
fi
if [ -d "$dir/vendor" ]; then
msgr run "Removing $dir/vendor"
echo "Removing $dir/vendor"
rm -rf "$dir/vendor"
fi
# Recursively check subdirectories
for item in "$dir"/*; do
remove_node_modules_vendor "$item"
[ -d "$item" ] && remove_node_modules_vendor "$item"
done
fi
}

View File

@@ -10,6 +10,7 @@ VERSION="1.0.0"
LANG_MAP="c:.c,.h|cpp:.cpp,.cc,.cxx,.hpp,.hxx|csharp:.cs|go:.go|java:.java|
javascript:.js,.jsx,.mjs,.ts,.tsx|python:.py|ruby:.rb|swift:.swift"
# Display usage information and options
usage()
{
cat << EOF
@@ -24,22 +25,26 @@ EOF
exit "${1:-0}"
}
# Log a timestamped message to stderr
log()
{
printf '[%s] %s\n' "$(date '+%H:%M:%S')" "$*" >&2
}
# Log an error message and exit
err()
{
log "ERROR: $*"
exit 1
}
# Verify codeql binary is available in PATH
check_codeql()
{
command -v codeql > /dev/null 2>&1 || err "codeql binary not found in PATH"
log "Found codeql: $(codeql version --format=terse)"
}
# Get or create the CodeQL cache directory
get_cache_dir()
{
cache="${XDG_CACHE_HOME:-$HOME/.cache}/codeql"
@@ -47,6 +52,7 @@ get_cache_dir()
printf '%s' "$cache"
}
# Detect supported programming languages in source path
detect_languages()
{
src_path="$1"
@@ -85,6 +91,7 @@ detect_languages()
printf '%s' "$detected" | tr ' ' '\n' | sort -u | tr '\n' ' ' | sed 's/ $//'
}
# Create a CodeQL database for a language
create_database()
{
lang="$1"
@@ -98,6 +105,7 @@ create_database()
--overwrite
}
# Display analysis result statistics from SARIF file
show_results_stats()
{
sarif_file="$1"
@@ -126,6 +134,7 @@ show_results_stats()
return 0
}
# Run CodeQL analysis for a single language
analyze_language()
{
lang="$1"
@@ -172,6 +181,7 @@ analyze_language()
rm -rf "$db_path"
}
# Parse arguments and run CodeQL analysis pipeline
main()
{
src_path="."

View File

@@ -24,7 +24,7 @@ str_to_operator = {
def vercmp(expr):
"""Version Comparison function."""
words = expr.split()
comparisons = [words[i: i + 3] for i in range(0, len(words) - 2, 2)]
comparisons = [words[i : i + 3] for i in range(0, len(words) - 2, 2)]
for left, op_str, right in comparisons:
compare_op = str_to_operator[op_str]
if not compare_op(version.parse(left), version.parse(right)):
@@ -63,7 +63,7 @@ def test():
except KeyError:
pass
else:
assert False, "invalid operator did not raise"
raise AssertionError("invalid operator did not raise")
if __name__ == "__main__":

View File

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash
#
# List environment variables grouped by the first part before underscore
# protecting environment variables that possibly contain sensitive information.
@@ -190,6 +190,7 @@ get_custom_group()
return 1
}
# Check if a key matches the skipped keys list
is_skipped()
{
local key=$1

View File

@@ -1,18 +1,26 @@
#!/usr/bin/env bash
#
# foreach <folder> <commands that should be run to each file>
# foreach "ls -d */" "git status" # run git status in each folder
# Run a command in each directory matching a pattern.
#
# Usage: x-foreach <listing-command> <command> [args...]
# x-foreach "ls -d */" "git status"
#
# Source: https://github.com/mvdan/dotfiles/blob/master/.bin/foreach
cmd=$1
set -euo pipefail
if [ $# -lt 2 ]; then
echo "Usage: $0 <listing-command> <command> [args...]"
exit 1
fi
listing=$1
shift
for dir in $($cmd); do
for dir in $(eval "$listing"); do
(
echo "$dir"
cd "$dir" || exit 1
# shellcheck disable=SC2294,SC2034
eval "$@" # allow multiple commands like "foo && bar"
"$@"
)
done
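
The rewritten loop runs each command inside `( … )`, so the `cd` never leaks into the caller's shell. A minimal standalone sketch of the same subshell pattern, using hypothetical directories created in a temp sandbox:

```shell
#!/usr/bin/env bash
# Sketch of the x-foreach subshell pattern: run a command in each
# directory produced by a listing command, isolating cd in a subshell.
set -euo pipefail

sandbox=$(mktemp -d)
mkdir -p "$sandbox/alpha" "$sandbox/beta"
cd "$sandbox"

for dir in $(ls -d */); do
  (
    cd "$dir" || exit 1
    # pwd changes only inside the subshell
    basename "$PWD"
  )
done

# The caller's working directory is untouched by the subshell cds
[ "$PWD" = "$sandbox" ] && echo "cwd preserved"
rm -rf "$sandbox"
```

Prints `alpha`, `beta`, then `cwd preserved`.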

View File

@@ -1,5 +1,4 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# Python script to find the largest files in a git repository.
# The general method is based on the script in this blog post:
@@ -32,60 +31,60 @@
# vim:tw=120:ts=4:ft=python:norl:
from subprocess import check_output, Popen, PIPE
import argparse
import glob
import signal
import sys
from subprocess import PIPE, Popen, check_output # nosec B404
sortByOnDiskSize = False
class Blob(object):
sha1 = ''
size = 0
packed_size = 0
path = ''
def __init__(self, line):
cols = line.split()
self.sha1, self.size, self.packed_size = cols[0], int(cols[2]), int(cols[3])
class Blob:
sha1 = ""
size = 0
packed_size = 0
path = ""
def __repr__(self):
return '{} - {} - {} - {}'.format(
self.sha1, self.size, self.packed_size, self.path)
def __init__(self, line):
cols = line.split()
self.sha1, self.size, self.packed_size = cols[0], int(cols[2]), int(cols[3])
def __lt__(self, other):
if (sortByOnDiskSize):
return self.size < other.size
else:
return self.packed_size < other.packed_size
def __repr__(self):
return f"{self.sha1} - {self.size} - {self.packed_size} - {self.path}"
def csv_line(self):
return "{},{},{},{}".format(
self.size/1024, self.packed_size/1024, self.sha1, self.path)
def __lt__(self, other):
if sortByOnDiskSize:
return self.size < other.size
else:
return self.packed_size < other.packed_size
def csv_line(self):
return f"{self.size / 1024},{self.packed_size / 1024},{self.sha1},{self.path}"
def main():
global sortByOnDiskSize
global sortByOnDiskSize
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGINT, signal_handler)
args = parse_arguments()
sortByOnDiskSize = args.sortByOnDiskSize
size_limit = 1024*args.filesExceeding
args = parse_arguments()
sortByOnDiskSize = args.sortByOnDiskSize
size_limit = 1024 * args.filesExceeding
if args.filesExceeding > 0:
print("Finding objects larger than {}kB…".format(args.filesExceeding))
else:
print("Finding the {} largest objects…".format(args.matchCount))
if args.filesExceeding > 0:
print(f"Finding objects larger than {args.filesExceeding}kB…")
else:
print(f"Finding the {args.matchCount} largest objects…")
blobs = get_top_blobs(args.matchCount, size_limit)
blobs = get_top_blobs(args.matchCount, size_limit)
populate_blob_paths(blobs)
print_out_blobs(blobs)
populate_blob_paths(blobs)
print_out_blobs(blobs)
def get_top_blobs(count, size_limit):
"""Get top blobs from git repository
"""Get top blobs from git repository
Args:
count (int): How many items to return
@@ -93,110 +92,141 @@ def get_top_blobs(count, size_limit):
Returns:
dict: Dictionary of Blobs
"""
sort_column = 4
"""
sort_column = 4
if sortByOnDiskSize:
sort_column = 3
if sortByOnDiskSize:
sort_column = 3
verify_pack = "git verify-pack -v `git rev-parse --git-dir`/objects/pack/pack-*.idx | grep blob | sort -k{}nr".format(sort_column) # noqa: E501
output = check_output(verify_pack, shell=True).decode('utf-8').strip().split("\n")[:-1] # noqa: E501
git_dir = check_output(["git", "rev-parse", "--git-dir"]).decode("utf-8").strip() # nosec B603 # nosemgrep
idx_files = glob.glob(f"{git_dir}/objects/pack/pack-*.idx")
verify_pack = Popen( # nosec B603
["git", "verify-pack", "-v", *idx_files],
stdout=PIPE,
stderr=PIPE,
)
grep_blob = Popen(["grep", "blob"], stdin=verify_pack.stdout, stdout=PIPE, stderr=PIPE) # nosec B603
if verify_pack.stdout:
verify_pack.stdout.close()
sort_cmd = Popen( # nosec B603
["sort", f"-k{sort_column}nr"],
stdin=grep_blob.stdout,
stdout=PIPE,
stderr=PIPE,
)
if grep_blob.stdout:
grep_blob.stdout.close()
output = [line for line in sort_cmd.communicate()[0].decode("utf-8").strip().split("\n") if line]
blobs = dict()
# use __lt__ to do the appropriate comparison
compare_blob = Blob("a b {} {} c".format(size_limit, size_limit))
for obj_line in output:
blob = Blob(obj_line)
blobs = {}
# use __lt__ to do the appropriate comparison
compare_blob = Blob(f"a b {size_limit} {size_limit} c")
for obj_line in output:
blob = Blob(obj_line)
if size_limit > 0:
if compare_blob < blob:
blobs[blob.sha1] = blob
else:
break
else:
blobs[blob.sha1] = blob
if size_limit > 0:
if compare_blob < blob:
blobs[blob.sha1] = blob
else:
break
else:
blobs[blob.sha1] = blob
if len(blobs) == count:
break
if len(blobs) == count:
break
return blobs
return blobs
def populate_blob_paths(blobs):
"""Populate blob paths that only have a path
"""Populate blob paths that only have a path
Args:
blobs (Blob, dict): Dictionary of Blobs
"""
if len(blobs):
print("Finding object paths…")
Args:
blobs (Blob, dict): Dictionary of Blobs
"""
if len(blobs):
print("Finding object paths…")
# Only include revs which have a path. Other revs aren't blobs.
rev_list = "git rev-list --all --objects | awk '$2 {print}'"
all_object_lines = check_output(rev_list, shell=True).decode('utf-8').strip().split("\n")[:-1] # noqa: E501
outstanding_keys = list(blobs.keys())
# Only include revs which have a path. Other revs aren't blobs.
rev_list = Popen(["git", "rev-list", "--all", "--objects"], stdout=PIPE, stderr=PIPE) # nosec B603
awk_filter = Popen(["awk", "$2 {print}"], stdin=rev_list.stdout, stdout=PIPE, stderr=PIPE) # nosec B603
if rev_list.stdout:
rev_list.stdout.close()
all_object_lines = [line for line in awk_filter.communicate()[0].decode("utf-8").strip().split("\n") if line]
outstanding_keys = list(blobs.keys())
for line in all_object_lines:
cols = line.split()
sha1, path = cols[0], " ".join(cols[1:])
for line in all_object_lines:
cols = line.split()
sha1, path = cols[0], " ".join(cols[1:])
if (sha1 in outstanding_keys):
outstanding_keys.remove(sha1)
blobs[sha1].path = path
if sha1 in outstanding_keys:
outstanding_keys.remove(sha1)
blobs[sha1].path = path
# short-circuit the search if we're done
if not len(outstanding_keys):
break
# short-circuit the search if we're done
if not len(outstanding_keys):
break
def print_out_blobs(blobs):
if len(blobs):
csv_lines = ["size,pack,hash,path"]
if len(blobs):
csv_lines = ["size,pack,hash,path"]
for blob in sorted(blobs.values(), reverse=True):
csv_lines.append(blob.csv_line())
for blob in sorted(blobs.values(), reverse=True):
csv_lines.append(blob.csv_line())
command = ["column", "-t", "-s", ","]
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE)
command = ["column", "-t", "-s", ","]
p = Popen(command, stdin=PIPE, stdout=PIPE, stderr=PIPE)
# Encode the input as bytes
input_data = ("\n".join(csv_lines) + "\n").encode()
# Encode the input as bytes
input_data = ("\n".join(csv_lines) + "\n").encode()
stdout, _ = p.communicate(input_data)
stdout, _ = p.communicate(input_data)
print("\nAll sizes in kB. The pack column is the compressed size of the object inside the pack file.\n") # noqa: E501
print("\nAll sizes in kB. The pack column is the compressed size of the object inside the pack file.\n")
print(stdout.decode("utf-8").rstrip('\n'))
else:
print("No files found which match those criteria.")
print(stdout.decode("utf-8").rstrip("\n"))
else:
print("No files found which match those criteria.")
def parse_arguments():
parser = argparse.ArgumentParser(
description='List the largest files in a git repository'
)
parser.add_argument(
'-c', '--match-count', dest='matchCount', type=int, default=10,
help='Files to return. Default is 10. Ignored if --files-exceeding is used.'
)
parser.add_argument(
'--files-exceeding', dest='filesExceeding', type=int, default=0,
help='The cutoff amount, in KB. Files with a pack size (or physical size, with -p) larger than this will be printed.' # noqa: E501
)
parser.add_argument(
'-p', '--physical-sort', dest='sortByOnDiskSize',
action='store_true', default=False,
help='Sort by the on-disk size. Default is to sort by the pack size.'
)
parser = argparse.ArgumentParser(description="List the largest files in a git repository")
parser.add_argument(
"-c",
"--match-count",
dest="matchCount",
type=int,
default=10,
help="Files to return. Default is 10. Ignored if --files-exceeding is used.",
)
parser.add_argument(
"--files-exceeding",
dest="filesExceeding",
type=int,
default=0,
help=(
"The cutoff amount, in KB. Files with a pack size"
" (or physical size, with -p) larger than this will be printed."
),
)
parser.add_argument(
"-p",
"--physical-sort",
dest="sortByOnDiskSize",
action="store_true",
default=False,
help="Sort by the on-disk size. Default is to sort by the pack size.",
)
return parser.parse_args()
return parser.parse_args()
def signal_handler(signal, frame):
print('Caught Ctrl-C. Exiting.')
def signal_handler(_signal, _frame):
print("Caught Ctrl-C. Exiting.")
sys.exit(0)
# Default function is main()
if __name__ == '__main__':
main()
if __name__ == "__main__":
main()
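
The `Popen` chain above reproduces the old `git verify-pack … | grep blob | sort -kNnr` shell pipeline without `shell=True`. The grep/sort stage can be checked on fabricated `verify-pack`-style lines (sample data, not real git output):

```shell
#!/usr/bin/env bash
# Sketch of the grep-blob / sort-by-pack-size stage used to rank objects.
# Each input line mimics `git verify-pack -v` output: sha, type, size,
# pack size, offset. Column 4 (pack size) is sorted numerically, descending.
set -euo pipefail

printf '%s\n' \
  'aaa blob 10 400 12' \
  'bbb tree 99 999 16' \
  'ccc blob 30 1200 20' \
  'ddd blob 20 800 24' \
  | grep blob | sort -k4nr | awk '{print $1}'
# prints: ccc, ddd, aaa (tree object filtered out)
```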

View File

@@ -1,4 +1,14 @@
#!/usr/bin/env bash
#
# Display external IP address.
#
# Source: https://github.com/thirtythreeforty/dotfiles/blob/master/bin/extip
curl icanhazip.com "${@}"
set -euo pipefail
if ! command -v curl &> /dev/null; then
echo "Error: curl is required" >&2
exit 1
fi
curl -sf icanhazip.com

View File

@@ -1,4 +1,4 @@
#!/bin/bash
#!/usr/bin/env bash
#
# x-localip: script to display the local IP addresses of the system
#

View File

@@ -1,50 +0,0 @@
#!/usr/bin/env bash
#
# Create a directory and cd into it
# Usage: x-mkd <dir>
set -euo pipefail
# Set verbosity with VERBOSE=1
VERBOSE="${VERBOSE:-0}"
# Function to print usage information
usage()
{
echo "Usage: $0 <dir>"
exit 1
}
# Function to print messages if VERBOSE is enabled
# $1 - message (string)
msg()
{
[[ "$VERBOSE" -eq 1 ]] && echo "$1"
}
# Function to create a directory and cd into it
# $1 - directory to create and cd into (string)
mkcd()
{
local dir=$1
mkdir -p "$dir" && msg "Directory $dir created"
cd "$dir" || {
msg "Failed to cd into $dir"
exit 1
}
msg "Changed directory to $dir"
}
# Main function
main()
{
if [ "$#" -ne 1 ]; then
usage
fi
mkcd "$1"
}
main "$@"

View File

@@ -1,19 +0,0 @@
# x-mkd
Create a directory and immediately `cd` into it.
## Usage
```bash
x-mkd <dir>
```
Set `VERBOSE=1` for status messages.
## Example
```bash
x-mkd project && git init
```
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->

View File

@@ -39,9 +39,9 @@
# Defaults
LOOP=0
SLEEP=1
VERBOSE=0
TIMEOUT=5
# Display usage information and options
usage()
{
echo "Usage: $0 [--loop|--forever] [--sleep=N] hostname1 hostname2 ..."
@@ -60,8 +60,6 @@ while [[ $# -gt 0 ]]; do
exit 0
;;
--verbose)
# shellcheck disable=SC2034
VERBOSE=1
shift
;;
--loop | --forever)

View File

@@ -227,6 +227,9 @@ do_check()
fi
}
# If sourced, provide functions without executing main logic
(return 0 2> /dev/null) && return
#######################################
# Main routine: Parse subcommand and arguments, normalize PATH,
# and dispatch to the appropriate functionality.
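
The `(return 0 2> /dev/null) && return` guard works because `return` is only legal in a function or a sourced file: when the script is sourced, the subshell test succeeds and the file stops before its main logic; when executed, it fails silently and execution continues. A small sketch with a throwaway file:

```shell
#!/usr/bin/env bash
# Sketch of the source-vs-execute guard from x-path.
set -euo pipefail

tmp=$(mktemp)
cat > "$tmp" << 'EOF'
say_hi() { echo "hi from function"; }
# If sourced, stop here: functions are defined but main logic is skipped.
(return 0 2> /dev/null) && return
echo "running main logic"
EOF

bash "$tmp"                 # executed: prints "running main logic"
# shellcheck disable=SC1090
. "$tmp" && say_hi          # sourced: skips main, but say_hi is defined
rm -f "$tmp"
```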

View File

@@ -1,44 +1,17 @@
#!/usr/bin/env bash
#
# Optimized script to append directories to PATH.
# For each given directory, it removes all duplicate occurrences from PATH
# and then appends it if the directory exists.
# Thin wrapper — delegates to x-path append.
# Can be sourced (PATH changes propagate) or executed.
#
# Usage: x-path-append <directory1> [<directory2> ...]
#
# Enable verbose output by setting the environment variable VERBOSE=1.
#
# Author: Ismo Vuorinen <https://github.com/ivuorinen> 2024
# License: MIT
VERBOSE="${VERBOSE:-0}"
# Ensure that at least one directory is provided.
[ "$#" -lt 1 ] && {
echo "Usage: $0 <directory> [<directory> ...]"
exit 1
}
# shellcheck source=./x-path
. "$(dirname "${BASH_SOURCE[0]:-$0}")/x-path"
for dir in "$@"; do
# Check if the specified directory exists.
if [ ! -d "$dir" ]; then
[ "$VERBOSE" -eq 1 ] && echo "(?) Directory '$dir' does not exist. Skipping."
continue
fi
# Remove all duplicate occurrences of the directory from PATH.
case ":$PATH:" in
*":$dir:"*)
PATH=":${PATH}:"
PATH="${PATH//:$dir:/:}"
PATH="${PATH#:}"
PATH="${PATH%:}"
[ "$VERBOSE" -eq 1 ] && echo "Removed previous occurrences of '$dir' from PATH."
;;
*) ;;
esac
# Append the directory to PATH.
export PATH="${PATH:+$PATH:}$dir"
[ "$VERBOSE" -eq 1 ] && echo "Appended '$dir' to PATH."
done
normalize_path_var
do_append "$@"
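
The remove-then-append dedup the wrapper previously inlined (and now delegates to `x-path`) can be sketched standalone; the colon-padding trick makes every entry `:`-delimited so parameter expansion can delete all occurrences:

```shell
#!/usr/bin/env bash
# Sketch of the PATH dedup-and-append idiom from the removed inline code.
set -euo pipefail

dedup_append() {
  local dir=$1
  case ":$PATH:" in
    *":$dir:"*)
      PATH=":${PATH}:"          # pad so every entry is colon-delimited
      PATH="${PATH//:$dir:/:}"  # delete all occurrences
      PATH="${PATH#:}"          # trim the padding back off
      PATH="${PATH%:}"
      ;;
  esac
  PATH="${PATH:+$PATH:}$dir"
  export PATH
}

PATH="/usr/bin:/opt/tools:/usr/bin"
dedup_append "/usr/bin"
echo "$PATH"   # /opt/tools:/usr/bin
```

Note the `${PATH//…}` scan does not catch *adjacent* duplicates in one pass; the non-adjacent case shown here is the common one.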

View File

@@ -1,50 +1,17 @@
#!/usr/bin/env bash
#
# Optimized script to batch prepend directories to PATH.
# For each given directory, it removes all duplicate occurrences from PATH
# and then prepends it. Directories that do not exist are skipped.
# Thin wrapper — delegates to x-path prepend.
# Can be sourced (PATH changes propagate) or executed.
#
# Usage: x-path-prepend <directory1> [<directory2> ...]
#
# Enable verbose output by setting the environment variable VERBOSE=1.
#
# Author: Ismo Vuorinen <https://github.com/ivuorinen> 2024
# License: MIT
VERBOSE="${VERBOSE:-0}"
# Ensure that at least one argument is provided.
[ "$#" -lt 1 ] && {
echo "Usage: $0 <directory> [<directory> ...]"
exit 1
}
# shellcheck source=./x-path
. "$(dirname "${BASH_SOURCE[0]:-$0}")/x-path"
# Save the arguments in an array.
dirs=("$@")
# Process the directories in reverse order so that the first argument ends up leftmost in PATH.
for ((idx = ${#dirs[@]} - 1; idx >= 0; idx--)); do
dir="${dirs[idx]}"
# Check if the specified directory exists.
if [ ! -d "$dir" ]; then
[ "$VERBOSE" -eq 1 ] && echo "(?) Directory '$dir' does not exist. Skipping."
continue
fi
# Remove all duplicate occurrences of the directory from PATH using built-in string operations.
case ":$PATH:" in
*":$dir:"*)
PATH=":${PATH}:"
PATH="${PATH//:$dir:/:}"
PATH="${PATH#:}"
PATH="${PATH%:}"
[ "$VERBOSE" -eq 1 ] && echo "Removed duplicate occurrences of '$dir' from PATH."
;;
*) ;;
esac
# Prepend the directory to PATH.
export PATH="$dir${PATH:+":$PATH"}"
[ "$VERBOSE" -eq 1 ] && echo "Prepended '$dir' to PATH."
done
normalize_path_var
do_prepend "$@"

View File

@@ -1,41 +1,17 @@
#!/usr/bin/env bash
#
# Optimized script to remove directories from PATH.
# For each specified directory, all occurrences are removed from PATH.
# Thin wrapper — delegates to x-path remove.
# Can be sourced (PATH changes propagate) or executed.
#
# Usage: x-path-remove <directory1> [<directory2> ...]
#
# Enable verbose output by setting the environment variable VERBOSE=1.
#
# Author: Ismo Vuorinen <https://github.com/ivuorinen> 2024
# License: MIT
VERBOSE="${VERBOSE:-0}"
# Ensure that at least one directory is provided.
[ "$#" -lt 1 ] && {
echo "Usage: $0 <directory> [<directory> ...]"
exit 1
}
# shellcheck source=./x-path
. "$(dirname "${BASH_SOURCE[0]:-$0}")/x-path"
for dir in "$@"; do
# Remove trailing slash if present, unless the directory is "/"
[ "$dir" != "/" ] && dir="${dir%/}"
# Check if the directory is present in PATH.
case ":$PATH:" in
*":$dir:"*)
# Remove all occurrences of the directory from PATH using parameter expansion.
PATH=":${PATH}:"
PATH="${PATH//:$dir:/:}"
PATH="${PATH#:}"
PATH="${PATH%:}"
[ "$VERBOSE" -eq 1 ] && echo "Removed '$dir' from PATH."
;;
*)
[ "$VERBOSE" -eq 1 ] && echo "(?) '$dir' is not in PATH."
;;
esac
done
export PATH
normalize_path_var
do_remove "$@"

View File

@@ -39,16 +39,19 @@ log_error()
{
echo -e "${RED}ERROR:${NC} $1" >&2
}
# Log a warning message
log_warn()
{
echo -e "${YELLOW}WARN:${NC} $1" >&2
}
# Log an informational message
log_info()
{
if [[ "${INFO:-0}" == "1" ]]; then
echo -e "${GREEN}INFO:${NC} $1" >&2
fi
}
# Log a debug message
log_debug()
{
if [[ "${DEBUG:-0}" == "1" ]]; then

View File

@@ -35,7 +35,7 @@ msg()
# Notify function
notify()
{
notify-call --replace-file "$replace_id" "$@"
notify-send.sh --replace-file "$replace_id" "$@"
}
# Stop recording function

626 local/bin/x-sonarcloud Executable file
View File

@@ -0,0 +1,626 @@
#!/usr/bin/env bash
# x-sonarcloud - Fetch SonarCloud issues for LLM analysis
# Copyright (c) 2025 - Licensed under MIT
#
# Usage:
# x-sonarcloud # Auto-detect, all open issues
# x-sonarcloud --pr <number> # PR-specific issues
# x-sonarcloud --branch <name> # Branch-specific issues
# x-sonarcloud --org <org> --project-key <key> # Explicit project
# x-sonarcloud --severities BLOCKER,CRITICAL # Filter by severity
# x-sonarcloud --types BUG,VULNERABILITY # Filter by type
# x-sonarcloud --statuses OPEN,CONFIRMED # Filter by status
# x-sonarcloud --resolved # Include resolved issues
# x-sonarcloud -h|--help # Show this help
#
# Examples:
# x-sonarcloud # All open issues in project
# x-sonarcloud --pr 42 # Issues on PR #42
# x-sonarcloud --branch main # Issues on main branch
# x-sonarcloud --severities BLOCKER --types BUG # Only blocker bugs
#
# Requirements:
# - curl and jq installed
# - SONAR_TOKEN environment variable set
# - sonar-project.properties or .sonarlint/connectedMode.json for auto-detection
set -euo pipefail
# Colors for output (stderr only)
readonly RED='\033[0;31m'
readonly GREEN='\033[0;32m'
readonly YELLOW='\033[1;33m'
readonly BLUE='\033[0;34m'
readonly NC='\033[0m' # No Color
# API constants
readonly MAX_PAGE_SIZE=500
readonly MAX_TOTAL_ISSUES=10000
# Show usage information
show_usage()
{
sed -n '3,27p' "$0" | sed 's/^# //' | sed 's/^#//'
}
# Log functions
log_error()
{
echo -e "${RED}ERROR:${NC} $1" >&2
}
# Log a warning message
log_warn()
{
echo -e "${YELLOW}WARN:${NC} $1" >&2
}
# Log an informational message
log_info()
{
if [[ "${INFO:-0}" == "1" ]]; then
echo -e "${GREEN}INFO:${NC} $1" >&2
fi
}
# Log a debug message
log_debug()
{
if [[ "${DEBUG:-0}" == "1" ]]; then
echo -e "${BLUE}DEBUG:${NC} $1" >&2
fi
}
# Check required dependencies
check_dependencies()
{
local missing=0
if ! command -v curl &> /dev/null; then
log_error "curl is not installed. Install it with your package manager."
missing=1
fi
if ! command -v jq &> /dev/null; then
log_error "jq is not installed. Install it with your package manager:"
log_error " https://jqlang.github.io/jq/download/"
missing=1
fi
if [[ "$missing" -eq 1 ]]; then
exit 1
fi
}
# Check authentication
check_auth()
{
if [[ -z "${SONAR_TOKEN:-}" ]]; then
log_error "SONAR_TOKEN environment variable is not set."
log_error "Generate a token at: https://sonarcloud.io/account/security"
log_error "Then export it: export SONAR_TOKEN=your_token_here"
exit 1
fi
}
# Detect project from sonar-project.properties
detect_project_from_properties()
{
local props_file="sonar-project.properties"
if [[ ! -f "$props_file" ]]; then
return 1
fi
local org key
org=$(grep -E '^sonar\.organization=' "$props_file" 2> /dev/null | cut -d'=' -f2- || echo "")
key=$(grep -E '^sonar\.projectKey=' "$props_file" 2> /dev/null | cut -d'=' -f2- || echo "")
if [[ -n "$org" && -n "$key" ]]; then
log_debug "Detected from sonar-project.properties: org=$org key=$key"
echo "$org" "$key" ""
return 0
fi
return 1
}
# Detect project from .sonarlint/connectedMode.json
detect_project_from_sonarlint()
{
local sonarlint_file=".sonarlint/connectedMode.json"
if [[ ! -f "$sonarlint_file" ]]; then
return 1
fi
local org key region
org=$(jq -r '.sonarCloudOrganization // empty' "$sonarlint_file" 2> /dev/null || echo "")
key=$(jq -r '.projectKey // empty' "$sonarlint_file" 2> /dev/null || echo "")
region=$(jq -r '.region // empty' "$sonarlint_file" 2> /dev/null || echo "")
if [[ -n "$org" && -n "$key" ]]; then
log_debug "Detected from .sonarlint/connectedMode.json: org=$org key=$key region=$region"
echo "$org" "$key" "$region"
return 0
fi
return 1
}
# Orchestrate project detection in priority order
detect_project()
{
local result
# 1. sonar-project.properties
if result=$(detect_project_from_properties); then
echo "$result"
return 0
fi
# 2. .sonarlint/connectedMode.json
if result=$(detect_project_from_sonarlint); then
echo "$result"
return 0
fi
# No config found
log_error "Could not auto-detect SonarCloud project configuration."
log_error "Provide one of the following:"
log_error " 1. sonar-project.properties with sonar.organization and sonar.projectKey"
log_error " 2. .sonarlint/connectedMode.json with sonarCloudOrganization and projectKey"
log_error " 3. CLI flags: --org <org> --project-key <key>"
return 1
}
# Get API base URL (currently same for all regions)
get_base_url()
{
echo "https://sonarcloud.io"
}
# Make an authenticated SonarCloud API request
sonar_api_request()
{
local url="$1"
log_debug "API request: $url"
local http_code body response
response=$(curl -s -w "\n%{http_code}" \
-H "Authorization: Bearer $SONAR_TOKEN" \
"$url" 2> /dev/null) || {
log_error "curl request failed for: $url"
return 1
}
http_code=$(echo "$response" | tail -n1)
body=$(echo "$response" | sed '$d')
case "$http_code" in
200)
echo "$body"
return 0
;;
401)
log_error "Authentication failed (HTTP 401). Check your SONAR_TOKEN."
return 1
;;
403)
log_error "Access forbidden (HTTP 403). Token may lack required permissions."
return 1
;;
404)
log_error "Not found (HTTP 404). Check organization and project key."
return 1
;;
429)
log_error "Rate limited (HTTP 429). Wait before retrying."
return 1
;;
*)
log_error "API request failed with HTTP $http_code"
log_debug "Response body: $body"
return 1
;;
esac
}
# Fetch a single page of issues
fetch_issues_page()
{
local base_url="$1"
local project_key="$2"
local page="$3"
local pr_number="${4:-}"
local branch="${5:-}"
local severities="${6:-}"
local types="${7:-}"
local statuses="${8:-}"
local resolved="${9:-}"
local url="${base_url}/api/issues/search?componentKeys=${project_key}"
url="${url}&p=${page}&ps=${MAX_PAGE_SIZE}"
if [[ -n "$pr_number" ]]; then
url="${url}&pullRequest=${pr_number}"
fi
if [[ -n "$branch" ]]; then
url="${url}&branch=${branch}"
fi
if [[ -n "$severities" ]]; then
url="${url}&severities=${severities}"
fi
if [[ -n "$types" ]]; then
url="${url}&types=${types}"
fi
if [[ -n "$statuses" ]]; then
url="${url}&statuses=${statuses}"
fi
if [[ -n "$resolved" ]]; then
url="${url}&resolved=${resolved}"
fi
sonar_api_request "$url"
}
# Fetch all issues with pagination
fetch_all_issues()
{
local base_url="$1"
local project_key="$2"
local pr_number="${3:-}"
local branch="${4:-}"
local severities="${5:-}"
local types="${6:-}"
local statuses="${7:-}"
local resolved="${8:-}"
local page=1
local all_issues="[]"
local total=0
while true; do
log_info "Fetching issues page $page..."
local response
response=$(fetch_issues_page "$base_url" "$project_key" "$page" \
"$pr_number" "$branch" "$severities" "$types" "$statuses" "$resolved") || return 1
local page_issues page_total
page_issues=$(echo "$response" | jq '.issues // []' 2> /dev/null || echo "[]")
page_total=$(echo "$response" | jq '.total // 0' 2> /dev/null || echo "0")
local page_count
page_count=$(echo "$page_issues" | jq 'length' 2> /dev/null || echo "0")
log_debug "Page $page: $page_count issues (total available: $page_total)"
# Merge into accumulated results
all_issues=$(echo "$all_issues" "$page_issues" | jq -s '.[0] + .[1]' 2> /dev/null || echo "$all_issues")
total=$(echo "$all_issues" | jq 'length' 2> /dev/null || echo "0")
# Check if we have all issues or hit the cap
if [[ "$page_count" -lt "$MAX_PAGE_SIZE" ]]; then
break
fi
if [[ "$total" -ge "$MAX_TOTAL_ISSUES" ]]; then
log_warn "Reached maximum of $MAX_TOTAL_ISSUES issues. Results may be incomplete."
break
fi
page=$((page + 1))
done
log_info "Fetched $total issues total"
echo "$all_issues"
}
# Format issues grouped by severity then by file
format_issues_by_severity()
{
local issues="$1"
local base_url="$2"
local org="$3"
local project_key="$4"
echo "$issues" | jq -r --arg base_url "$base_url" --arg org "$org" --arg key "$project_key" '
group_by(.severity) | sort_by(-(
if .[0].severity == "BLOCKER" then 5
elif .[0].severity == "CRITICAL" then 4
elif .[0].severity == "MAJOR" then 3
elif .[0].severity == "MINOR" then 2
elif .[0].severity == "INFO" then 1
else 0 end
)) | .[] |
"### Severity: \(.[0].severity)\n" +
(
group_by(.component) | .[] |
"#### File: \(.[0].component | split(":") | if length > 1 then .[1:] | join(":") else .[0] end)\n" +
(
[.[] |
"##### Issue: \(.message)\n" +
"- **Rule:** \(.rule)\n" +
"- **Type:** \(.type)\n" +
"- **Severity:** \(.severity)\n" +
"- **Status:** \(.status)\n" +
"- **Line:** \(.line // "N/A")\n" +
"- **Effort:** \(.effort // "N/A")\n" +
"- **Created:** \(.creationDate // "N/A")\n" +
"- **URL:** \($base_url)/project/issues?open=\(.key)&id=\($key)\n"
] | join("\n")
)
)
' 2> /dev/null || echo "Error formatting issues."
}
# Format summary counts
format_summary()
{
local issues="$1"
echo "### By Severity"
echo ""
echo "$issues" | jq -r '
group_by(.severity) | .[] |
"- **\(.[0].severity):** \(length)"
' 2> /dev/null || echo "- Error computing severity counts"
echo ""
echo "### By Type"
echo ""
echo "$issues" | jq -r '
group_by(.type) | .[] |
"- **\(.[0].type):** \(length)"
' 2> /dev/null || echo "- Error computing type counts"
echo ""
echo "### Total"
echo ""
local count
count=$(echo "$issues" | jq 'length' 2> /dev/null || echo "0")
echo "- **Total issues:** $count"
}
# Format the full markdown output
format_output()
{
local org="$1"
local project_key="$2"
local mode="$3"
local mode_value="$4"
local base_url="$5"
local issues="$6"
local issue_count
issue_count=$(echo "$issues" | jq 'length' 2> /dev/null || echo "0")
# Header and LLM instructions
cat << 'EOF'
# SonarCloud Issues Analysis Report
## LLM Processing Instructions
You are analyzing code quality issues from SonarCloud for this project.
**Your tasks:**
1. **Triage**: Review each issue and assess its real impact on the codebase
2. **Priority Assessment**: Rank issues by severity and likelihood of causing problems
3. **Code Verification**: Check the actual source code to confirm each issue is valid
4. **Root Cause Analysis**: Identify why the issue exists and what pattern caused it
5. **Implementation Plan**: Create actionable fix tasks grouped by file for efficiency
6. **False Positive Detection**: Flag issues that appear to be false positives with reasoning
**Tools to use:**
- `find`, `cat`, `rg` commands and available tools to examine current codebase
- `git log` and `git blame` to understand code history and authorship
- File system tools to verify mentioned files exist and check current state
EOF
# Project information
cat << EOF
## Project Information
- **Organization:** $org
- **Project Key:** $project_key
EOF
case "$mode" in
pr)
echo "- **Mode:** Pull Request #$mode_value"
echo "- **URL:** ${base_url}/project/issues?pullRequest=${mode_value}&id=${project_key}"
;;
branch)
echo "- **Mode:** Branch \`$mode_value\`"
echo "- **URL:** ${base_url}/project/issues?branch=${mode_value}&id=${project_key}"
;;
*)
echo "- **Mode:** Project (all open issues)"
echo "- **URL:** ${base_url}/project/issues?id=${project_key}"
;;
esac
echo "- **Dashboard:** ${base_url}/project/overview?id=${project_key}"
# Issues section
echo ""
echo "## Issues ($issue_count total)"
echo ""
if [[ "$issue_count" -eq 0 ]]; then
echo "No issues found matching the specified filters."
else
format_issues_by_severity "$issues" "$base_url" "$org" "$project_key"
echo ""
echo "## Summary"
echo ""
format_summary "$issues"
fi
# Footer
cat << 'EOF'
## Next Steps for LLM Analysis
1. **Validate against current code:**
- Check if mentioned files and lines still match the reported issues
- Verify issues are not already fixed in the current branch
- Identify false positives and explain why they are false positives
2. **Prioritize fixes:**
- Address BLOCKER and CRITICAL severity issues first
- Group fixes by file to minimize context switching
- Consider effort estimates when planning the fix order
3. **Group by file for implementation:**
- Batch changes to the same file together
- Consider dependencies between fixes
- Plan atomic commits per logical change group
4. **Track progress:**
- Use todo lists and memory tools to track which issues are addressed
- Mark false positives with clear reasoning
- Verify fixes do not introduce new issues
EOF
}
# Main pipeline: fetch and display issues
fetch_and_display_issues()
{
local org="$1"
local project_key="$2"
local mode="$3"
local mode_value="$4"
local severities="${5:-}"
local types="${6:-}"
local statuses="${7:-}"
local resolved="${8:-}"
local base_url
base_url=$(get_base_url)
local pr_number=""
local branch=""
case "$mode" in
pr)
pr_number="$mode_value"
;;
branch)
branch="$mode_value"
;;
esac
log_info "Fetching SonarCloud issues for $project_key (mode: $mode)..."
local issues
issues=$(fetch_all_issues "$base_url" "$project_key" \
"$pr_number" "$branch" "$severities" "$types" "$statuses" "$resolved") || {
log_error "Failed to fetch issues"
return 1
}
format_output "$org" "$project_key" "$mode" "$mode_value" "$base_url" "$issues"
}
# Main function
main()
{
local org=""
local project_key=""
local mode="project"
local mode_value=""
local severities=""
local types=""
local statuses="OPEN,CONFIRMED,REOPENED"
local resolved="false"
# Parse arguments
while [[ $# -gt 0 ]]; do
case "$1" in
-h | --help)
show_usage
exit 0
;;
--pr)
mode="pr"
mode_value="${2:?Missing PR number after --pr}"
shift 2
;;
--branch)
mode="branch"
mode_value="${2:?Missing branch name after --branch}"
shift 2
;;
--org)
org="${2:?Missing organization after --org}"
shift 2
;;
--project-key)
project_key="${2:?Missing project key after --project-key}"
shift 2
;;
--severities)
severities="${2:?Missing severities after --severities}"
shift 2
;;
--types)
types="${2:?Missing types after --types}"
shift 2
;;
--statuses)
statuses="${2:?Missing statuses after --statuses}"
shift 2
;;
--resolved)
resolved="true"
statuses=""
shift
;;
*)
log_error "Unknown argument: $1"
show_usage
exit 1
;;
esac
done
check_dependencies
check_auth
# Auto-detect project if not specified via CLI
if [[ -z "$org" || -z "$project_key" ]]; then
local detected
detected=$(detect_project) || exit 1
# shellcheck disable=SC2034 # region reserved for future per-region base URLs
read -r detected_org detected_key detected_region <<< "$detected"
if [[ -z "$org" ]]; then
org="$detected_org"
fi
if [[ -z "$project_key" ]]; then
project_key="$detected_key"
fi
fi
log_debug "Organization: $org"
log_debug "Project Key: $project_key"
log_debug "Mode: $mode"
log_debug "Severities: ${severities:-all}"
log_debug "Types: ${types:-all}"
log_debug "Statuses: ${statuses:-all}"
log_debug "Resolved: $resolved"
fetch_and_display_issues "$org" "$project_key" "$mode" "$mode_value" \
"$severities" "$types" "$statuses" "$resolved"
}
# Run main function with all arguments
main "$@"
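
`sonar_api_request` asks curl to append the HTTP status on its own line (`-w "\n%{http_code}"`), then splits body and status with `tail`/`sed`. The split can be checked without any network access by simulating the response with `printf`:

```shell
#!/usr/bin/env bash
# Sketch of the body/status split from sonar_api_request, with the
# curl response simulated so the check is self-contained.
set -euo pipefail

response=$(printf '{"issues":[]}\n200')

http_code=$(echo "$response" | tail -n1)   # last line: the status code
body=$(echo "$response" | sed '$d')        # everything above it

echo "status=$http_code"   # status=200
echo "body=$body"          # body={"issues":[]}
```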

46 local/bin/x-sonarcloud.md Normal file
View File

@@ -0,0 +1,46 @@
# x-sonarcloud
---
## Usage
```bash
x-sonarcloud # Auto-detect, all open issues
x-sonarcloud --pr <number> # PR-specific issues
x-sonarcloud --branch <name> # Branch-specific issues
x-sonarcloud --org <org> --project-key <key> # Explicit project
x-sonarcloud --severities BLOCKER,CRITICAL # Filter by severity
x-sonarcloud --types BUG,VULNERABILITY # Filter by type
x-sonarcloud --statuses OPEN,CONFIRMED # Filter by status
x-sonarcloud --resolved # Include resolved issues
x-sonarcloud -h|--help # Show help
```
Fetches SonarCloud code quality issues via the REST API and formats them as
structured markdown with LLM processing instructions for automated analysis
and triage.
## Examples
```bash
x-sonarcloud # All open issues in project
x-sonarcloud --pr 42 # Issues on PR #42
x-sonarcloud --branch main # Issues on main branch
x-sonarcloud --severities BLOCKER --types BUG # Only blocker bugs
```
## Requirements
- `curl` and `jq` installed
- `SONAR_TOKEN` environment variable set
(generate at <https://sonarcloud.io/account/security>)
- Project auto-detection via `sonar-project.properties` or
`.sonarlint/connectedMode.json`, or explicit `--org`/`--project-key` flags
## Environment Variables
- `SONAR_TOKEN` — Bearer token for SonarCloud API authentication (required)
- `INFO=1` — Enable informational log messages on stderr
- `DEBUG=1` — Enable debug log messages on stderr
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->
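The auto-detection described above reads keys out of `sonar-project.properties`. A minimal sketch of how such a lookup could work — the `sonar.organization=` / `sonar.projectKey=` line format is the SonarCloud convention, but the `read_sonar_prop` helper name is hypothetical, not the script's actual implementation:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical helper: read a key=value property from sonar-project.properties.
read_sonar_prop()
{
  local file="$1" key="$2"
  # Match "key=...", keep everything after the first '='
  grep -E "^${key}=" "$file" | head -n 1 | cut -d'=' -f2-
}

# Demo with a temporary properties file
props="$(mktemp)"
trap 'rm -f "$props"' EXIT
cat > "$props" << 'EOF'
sonar.organization=my-org
sonar.projectKey=my-org_my-repo
EOF

read_sonar_prop "$props" "sonar.organization" # prints: my-org
read_sonar_prop "$props" "sonar.projectKey"   # prints: my-org_my-repo
```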


@@ -154,6 +154,7 @@ get_state()
# ERROR HANDLING AND CLEANUP
# ============================================================================
# Clean up temporary files and handle exit
cleanup()
{
exit_code=$?
@@ -177,6 +178,7 @@ trap cleanup EXIT INT TERM
# LOGGING FUNCTIONS
# ============================================================================
# Create audit directories and initialize log file
setup_logging()
{
# Create all necessary directories
@@ -197,6 +199,7 @@ setup_logging()
} >> "$LOG_FILE"
}
# Log a message with timestamp and severity level
log_message()
{
level="$1"
@@ -225,6 +228,7 @@ log_message()
# INPUT VALIDATION
# ============================================================================
# Validate hostname format for SSH connection
validate_hostname()
{
hostname="$1"
@@ -244,6 +248,7 @@ validate_hostname()
return 0
}
# Validate username format for SSH connection
validate_username()
{
username="$1"
@@ -263,6 +268,7 @@ validate_username()
return 0
}
# Parse input file into validated host entries
parse_host_list()
{
input_file="$1"
@@ -309,6 +315,7 @@ parse_host_list()
# SSH CONNECTION FUNCTIONS
# ============================================================================
# Execute SSH command with retry logic and key fallback
ssh_with_retry()
{
host="$1"
@@ -373,6 +380,7 @@ ssh_with_retry()
return 1
}
# Verify SSH connectivity to a host
test_ssh_connectivity()
{
host="$1"
@@ -392,6 +400,7 @@ test_ssh_connectivity()
# SSH SECURITY AUDIT FUNCTIONS
# ============================================================================
# Audit SSH daemon configuration on a remote host
check_sshd_config()
{
host="$1"
@@ -451,6 +460,7 @@ check_sshd_config()
# AUTOMATED UPDATES DETECTION
# ============================================================================
# Check if automated security updates are enabled
check_automated_updates()
{
host="$1"
@@ -532,6 +542,7 @@ check_automated_updates()
# PENDING REBOOT DETECTION
# ============================================================================
# Detect if a remote host requires a reboot
check_pending_reboot()
{
host="$1"
@@ -602,6 +613,7 @@ check_pending_reboot()
# REMEDIATION FUNCTIONS
# ============================================================================
# Create a timestamped backup of sshd_config
backup_sshd_config()
{
host="$1"
@@ -616,6 +628,7 @@ backup_sshd_config()
" "$ssh_key"
}
# Disable password authentication on a remote host
disable_password_auth()
{
host="$1"
@@ -668,6 +681,7 @@ ClientAliveCountMax 2
# REPORTING FUNCTIONS
# ============================================================================
# Generate CSV report from audit results
generate_csv_report()
{
report_file="$1"
@@ -693,6 +707,7 @@ generate_csv_report()
done < "$HOSTS_LIST_FILE"
}
# Display formatted audit summary to terminal
display_summary()
{
printf '\n'
@@ -743,6 +758,7 @@ display_summary()
# MAIN AUDIT FUNCTION
# ============================================================================
# Run all audit checks on a single host
audit_host()
{
host_entry="$1"
@@ -788,6 +804,7 @@ audit_host()
# MAIN EXECUTION
# ============================================================================
# Main entry point: parse args, run audits, generate report
main()
{
input_file="${1:-}"


@@ -37,7 +37,7 @@ The script automatically tries authentication methods in this order:
1. **Specific key** (if provided in host file)
2. **Auto-detected default keys** (`~/.ssh/id_ed25519`, `id_rsa`, `id_ecdsa`,
`id_dsa`)
`id_dsa`)
3. **SSH agent or system default authentication**
This means you can mix hosts with and without specific keys, and the script will
@@ -178,7 +178,7 @@ SSH_RETRIES=3
3. **Staged Rollout**: Test on non-critical hosts first
4. **Review Logs**: Check log files for detailed information
5. **Preserve Access**: Script ensures key-based auth works before disabling
passwords
passwords
## Version


@@ -1,17 +1,21 @@
#!/usr/bin/env bash
#
# This file echoes a bunch of 24-bit color codes
# to the terminal to demonstrate its functionality.
# The foreground escape sequence is ^[38;2;<r>;<g>;<b>m
# The background escape sequence is ^[48;2;<r>;<g>;<b>m
# <r> <g> <b> range from 0 to 255 inclusive.
# The escape sequence ^[0m returns output to default
# Display 24-bit terminal color test.
#
# Usage: x-term-colors
#
# The foreground escape sequence is ^[38;2;<r>;<g>;<b>m
# The background escape sequence is ^[48;2;<r>;<g>;<b>m
# <r> <g> <b> range from 0 to 255 inclusive.
# The escape sequence ^[0m returns output to default
# Set terminal background to an RGB color
setBackgroundColor()
{
echo -en "\x1b[48;2;$1;$2;$3""m"
}
# Reset terminal output formatting
resetOutput()
{
echo -en "\x1b[0m\n"


@@ -28,6 +28,7 @@
set -euo pipefail
# Display usage information and options
usage()
{
cat << EOF
@@ -52,6 +53,7 @@ THUMB_SUFFIX="${THUMB_SUFFIX:-_thumb}"
# List of MIME types supported by ImageMagick (adjust as needed)
ALLOWED_MIMETYPES=("image/jpeg" "image/png" "image/gif" "image/bmp" "image/tiff" "image/webp")
# Verify ImageMagick is available
check_magick_installed()
{
if ! command -v magick &> /dev/null; then
@@ -60,6 +62,7 @@ check_magick_installed()
fi
}
# Verify mimetype command is available
check_mimetype_installed()
{
if ! command -v mimetype &> /dev/null; then
@@ -165,6 +168,7 @@ generate_thumbnails()
done < <(find "$source_dir" -type f -print0)
}
# Parse options, validate inputs, and generate thumbnails
main()
{
parse_options "$@"


@@ -1,68 +0,0 @@
#!/usr/bin/env bash
#
# This script contains a helper for sha256 validating your downloads
#
# Source: https://gist.github.com/onnimonni/b49779ebc96216771a6be3de46449fa1
# Author: Onni Hakala
# License: MIT
#
# Updated by Ismo Vuorinen <https://github.com/ivuorinen> 2022
##
set -euo pipefail
# Stop program and give error message
# $1 - error message (string)
error()
{
echo "(!) ERROR: $1" >&2
exit 1
}
# Check for sha256sum command
if ! command -v sha256sum &> /dev/null; then
error "sha256sum could not be found, please install it first"
fi
# Return sha256sum for file
# $1 - filename (string)
get_sha256sum()
{
sha256sum "$1" | head -c 64
}
# Validate input arguments
validate_inputs()
{
if [ -z "${filename:-}" ]; then
error "You need to provide filename as the first parameter"
fi
if [ -z "${file_hash:-}" ]; then
error "You need to provide sha256sum as the second parameter"
fi
}
# Main validation logic
validate_file()
{
if [ ! -f "$filename" ]; then
error "File $filename doesn't exist"
elif [ "$(get_sha256sum "$filename")" = "$file_hash" ]; then
echo "(*) Success: $filename matches provided sha256sum"
else
error "$filename doesn't match provided sha256sum"
fi
}
# Main function
main()
{
filename=$1
file_hash=$2
validate_inputs
validate_file
}
main "$@"


@@ -1,14 +0,0 @@
# x-validate-sha256sum.sh
This script contains a helper for sha256 validating your downloads
## Usage
```bash
x-validate-sha256sum.sh file sha256sum
```
The script computes the SHA256 hash of `file` and compares it to the
expected value. It exits non-zero if the sums differ.
<!-- vim: set ft=markdown spell spelllang=en_us cc=80 : -->
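The check the (now-removed) script performed can be reproduced inline. A minimal sketch of the same comparison, assuming GNU coreutils `sha256sum` (the `matches_sha256` name is illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Compare a file's SHA256 against an expected hash, as the script did.
matches_sha256()
{
  local file="$1" expected="$2"
  # sha256sum prints "<64 hex chars>  <filename>"; keep only the hash
  [[ "$(sha256sum "$file" | head -c 64)" == "$expected" ]]
}

# Demo: hash a throwaway file, then validate it against its own hash
f="$(mktemp)"
printf 'hello\n' > "$f"
expected="$(sha256sum "$f" | head -c 64)"
matches_sha256 "$f" "$expected" && echo "(*) Success: $f matches provided sha256sum"
rm -f "$f"
```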


@@ -1,4 +1,5 @@
#!/usr/bin/env bash
set -euo pipefail
#
# Wait until a given host is down (determined by ping) then execute the
# given command
@@ -25,6 +26,7 @@ if [ "$#" -lt 2 ]; then
exit 1
fi
# Wait until host stops responding to ping
wait_for_host_down()
{
local host=$1
@@ -36,6 +38,7 @@ wait_for_host_down()
done
}
# Wait for host to go down then execute command
main()
{
local host=$1


@@ -1,4 +1,5 @@
#!/usr/bin/env bash
set -euo pipefail
#
# Wait until a given host is online (determined by ping) then execute the
# given command
@@ -29,6 +30,7 @@ if [ "$#" -lt 2 ]; then
exit 1
fi
# Extract hostname from arguments, handling ssh shortcut
get_host()
{
if [ "$1" = "ssh" ]; then
@@ -38,6 +40,7 @@ get_host()
fi
}
# Wait until host responds to ping
wait_for_host()
{
local host=$1
@@ -49,6 +52,7 @@ wait_for_host()
done
}
# Wait for host to come online then execute command
main()
{
local host

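Both wait scripts poll with `ping` inside a loop. A generic sketch of the same poll-until pattern, using an arbitrary predicate command instead of `ping` so it can be exercised offline (the `wait_until` name, attempt cap, and one-second interval are illustrative, not the scripts' actual interface):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Poll a predicate command until it succeeds, like the ping loops above.
# $1 - max attempts; remaining args - predicate command to run each attempt
wait_until()
{
  local attempts="$1"
  shift
  local i
  for ((i = 0; i < attempts; i++)); do
    if "$@"; then
      return 0
    fi
    sleep 1
  done
  return 1
}

# Demo: wait for a file to appear (created up front, so it succeeds at once)
marker="$(mktemp)"
wait_until 5 test -e "$marker" && echo "host is up"
rm -f "$marker"
```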

@@ -9,10 +9,15 @@
"lint:biome": "biome check .",
"fix:biome": "biome check --write .",
"format": "biome format --write .",
"lint:prettier": "prettier --check '**/*.{yml,yaml}'",
"fix:prettier": "prettier --write '**/*.{yml,yaml}'",
"format:yaml": "prettier --write '**/*.{yml,yaml}'",
"test": "bash test-all.sh",
"lint:ec": "ec -f gcc",
"lint": "yarn lint:biome && yarn lint:ec",
"fix": "yarn fix:biome"
"lint:md-table": "git ls-files '*.md' | xargs markdown-table-formatter --check",
"fix:md-table": "git ls-files '*.md' | xargs markdown-table-formatter",
"lint": "yarn lint:biome && yarn lint:prettier && yarn lint:ec && yarn lint:md-table",
"fix": "yarn fix:biome && yarn fix:prettier && yarn fix:md-table"
},
"repository": {
"type": "git",
@@ -33,6 +38,8 @@
"@types/node": "^24.0.1",
"bats": "^1.12.0",
"editorconfig-checker": "^6.1.0",
"markdown-table-formatter": "^1.7.0",
"prettier": "^3.8.1",
"typescript": "^5.8.3"
},
"packageManager": "yarn@4.12.0"

pyproject.toml Normal file

@@ -0,0 +1,9 @@
[tool.ruff]
target-version = "py39"
line-length = 120
[tool.ruff.lint]
select = ["E", "F", "W", "I", "UP", "B", "SIM", "C4"]
[tool.ruff.format]
quote-style = "double"


@@ -1,4 +1,5 @@
#!/usr/bin/env bash
set -euo pipefail
# @description Create file containing key mappings for Neovim
# Usage: ./create-nvim-keymaps.sh
#
@@ -6,6 +7,7 @@
source "${DOTFILES}/config/shared.sh"
DEST="$HOME/.dotfiles/docs/nvim-keybindings.md"
# Generate Neovim keybindings documentation
main()
{
msg "Generating Neovim keybindings documentation"
@@ -15,7 +17,7 @@ main()
printf "\`\`\`txt"
} > "$DEST"
nvim -c "redir! >> $DEST" -c 'silent verbose map' -c 'redir END' -c 'q'
nvim -c "redir! >> \"$DEST\"" -c 'silent verbose map' -c 'redir END' -c 'q'
printf "\n\`\`\`\n\n- Generated on %s\n" "$(date)" >> "$DEST"
@@ -27,6 +29,7 @@ main()
&& mv "${DEST}.tmp" "$DEST"
msg "Neovim keybindings documentation generated at $DEST"
return 0
}
main "$@"


@@ -6,20 +6,30 @@
source "${DOTFILES}/config/shared.sh"
DEST="$HOME/.dotfiles/docs/wezterm-keybindings.md"
# Generate wezterm keybindings documentation
main()
{
msg "Generating wezterm keybindings documentation"
local tmp
tmp="$(mktemp)"
trap 'rm -f "$tmp"' RETURN
{
printf "# wezterm keybindings\n\n"
printf "\`\`\`txt\n"
} > "$DEST"
} > "$tmp"
wezterm show-keys >> "$DEST"
if ! wezterm show-keys >> "$tmp"; then
msg "Failed to run 'wezterm show-keys'"
return 1
fi
printf "\`\`\`\n\n- Generated on %s\n" "$(date)" >> "$DEST"
printf "\`\`\`\n\n- Generated on %s\n" "$(date)" >> "$tmp"
mv "$tmp" "$DEST"
msg "wezterm keybindings documentation generated at $DEST"
return 0
}
main "$@"
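The rewrite above generates into a `mktemp` file, cleans it up with a `RETURN` trap, and only `mv`s over the destination on success, so a failed `wezterm show-keys` can no longer truncate the existing doc. The write-then-rename pattern in isolation (the `generate_doc` name and demo content are illustrative):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Write to a temp file first; rename over the destination only on success.
generate_doc()
{
  local dest="$1"
  local tmp
  tmp="$(mktemp)"
  # The RETURN trap removes the temp file when this function returns,
  # whether generation succeeded or failed partway
  trap 'rm -f "$tmp"' RETURN
  {
    printf '# demo doc\n\n'
    printf 'generated content\n'
  } > "$tmp"
  mv "$tmp" "$dest"
}

out="$(mktemp)"
generate_doc "$out"
cat "$out"
rm -f "$out"
```

Because `mv` on the same filesystem is atomic, readers of the destination never observe a half-written file.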

scripts/install-apt-packages.sh Executable file

@@ -0,0 +1,79 @@
#!/usr/bin/env bash
set -euo pipefail
# @description Install essential apt packages for development.
#
# shellcheck source=shared.sh
source "$DOTFILES/config/shared.sh"
msgr run "Starting to install apt packages"
if ! command -v apt &> /dev/null; then
msgr warn "apt not found (not a Debian-based system)"
exit 0
fi
packages=(
# Build essentials
build-essential # gcc, g++, make
cmake # Cross-platform build system
pkg-config # Helper for compiling against libraries
autoconf # Automatic configure script builder
automake # Makefile generator
libtool # Generic library support script
# Libraries for compiling languages
libssl-dev # SSL development headers
libffi-dev # Foreign function interface
zlib1g-dev # Compression library
libreadline-dev # Command-line editing
libbz2-dev # Bzip2 compression
libsqlite3-dev # SQLite database
libncurses-dev # Terminal UI library
# CLI utilities (not in cargo/go/npm)
jq # JSON processor
tmux # Terminal multiplexer
tree # Directory listing
unzip # Archive extraction
shellcheck # Shell script linter
socat # Multipurpose network relay
gnupg # GPG encryption/signing
software-properties-common # add-apt-repository command
)
# Install apt packages that are not already present
install_packages()
{
local to_install=()
for pkg in "${packages[@]}"; do
pkg="${pkg%%#*}"
pkg="${pkg// /}"
[[ -z "$pkg" ]] && continue
if dpkg -s "$pkg" &> /dev/null; then
msgr ok "$pkg already installed"
else
to_install+=("$pkg")
fi
done
if [[ ${#to_install[@]} -gt 0 ]]; then
msgr run "Installing ${#to_install[@]} packages: ${to_install[*]}"
sudo apt update
sudo apt install -y "${to_install[@]}"
else
msgr ok "All packages already installed"
fi
return 0
}
# Install all apt packages and report completion
main()
{
install_packages
msgr yay "apt package installations complete"
return 0
}
main "$@"
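The `install_packages` loop above strips the inline `# comment` and its padding from each array entry with two parameter expansions before calling `dpkg -s`. That stripping in isolation (package names are examples from the list above):

```shell
#!/usr/bin/env bash
set -euo pipefail

# Strip an inline comment and all spaces from a package list entry.
clean_pkg()
{
  local pkg="$1"
  pkg="${pkg%%#*}" # drop everything from the first '#' onward
  pkg="${pkg// /}" # drop remaining spaces (the padding before the comment)
  printf '%s\n' "$pkg"
}

clean_pkg 'build-essential    # gcc, g++, make' # prints: build-essential
clean_pkg 'jq                 # JSON processor' # prints: jq
```

Note that `${pkg// /}` removes *all* spaces, which is safe here only because apt package names never contain spaces.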


@@ -11,10 +11,10 @@ scripts/install-cargo-packages.sh
## What it does
1. If `cargo-install-update` is available, updates all existing packages first
and tracks which packages are already installed.
and tracks which packages are already installed.
2. Installs each package from the inline list using `cargo install`,
skipping any already handled by the update step.
Builds run in parallel using available CPU cores (minus two).
skipping any already handled by the update step.
Builds run in parallel using available CPU cores (minus two).
3. Runs package-specific post-install steps.
4. Cleans the cargo cache with `cargo cache --autoclean`.


@@ -1,5 +1,9 @@
#!/usr/bin/env bash
set -euo pipefail
# @description Install cargo/rust packages.
#
# shellcheck source=shared.sh
source "$DOTFILES/config/shared.sh"
msgr run "Starting to install rust/cargo packages"
@@ -21,18 +25,18 @@ fi
# Cargo packages to install
packages=(
cargo-update # A cargo subcommand for checking and applying updates to installed executables
cargo-cache # Cargo cache management utility
tree-sitter-cli # An incremental parsing system for programming tools
bkt # A subprocess caching utility
difftastic # A structural diff that understands syntax
fd-find # A simple, fast and user-friendly alternative to 'find'
ripgrep # Recursively searches directories for a regex pattern while respecting your gitignore
bob-nvim # A version manager for neovim
bottom # A cross-platform graphical process/system monitor
eza # A modern alternative to ls
tmux-sessionizer # A tool for opening git repositories as tmux sessions
zoxide # A smarter cd command
cargo-update # A cargo subcommand for checking and applying updates to installed executables
cargo-cache # Cargo cache management utility
tree-sitter-cli # An incremental parsing system for programming tools
bkt # A subprocess caching utility
difftastic # A structural diff that understands syntax
fd-find # A simple, fast and user-friendly alternative to 'find'
ripgrep # Recursively searches directories for a regex pattern while respecting your gitignore
bob-nvim # A version manager for neovim
bottom # A cross-platform graphical process/system monitor
eza # A modern alternative to ls
tmux-sessionizer # A tool for opening git repositories as tmux sessions
zoxide # A smarter cd command
)
# Number of jobs to run in parallel; this helps keep the system responsive
@@ -53,6 +57,7 @@ install_packages()
msgr run_done "Done installing $pkg"
echo ""
done
return 0
}
# Function to perform additional steps for installed cargo packages
@@ -68,13 +73,16 @@ post_install_steps()
msgr run "Removing cargo cache"
cargo cache --autoclean
msgr "done" "Done removing cargo cache"
return 0
}
# Install cargo packages and run post-install steps
main()
{
install_packages
msgr "done" "Installed cargo packages!"
post_install_steps
return 0
}
main "$@"
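The parallel job count mentioned above ("available CPU cores, minus two") can be derived like this — a sketch assuming `nproc` (GNU coreutils) with a `sysctl` fallback for macOS, clamped so a dual-core machine still gets one job; the `compute_jobs` name is illustrative:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Compute a parallel job count: CPU cores minus two, but never below one.
compute_jobs()
{
  local cores
  if command -v nproc &> /dev/null; then
    cores="$(nproc)"
  else
    cores="$(sysctl -n hw.ncpu)" # macOS fallback
  fi
  local jobs=$((cores - 2))
  if ((jobs < 1)); then
    jobs=1
  fi
  printf '%s\n' "$jobs"
}

compute_jobs
```

The result can then be passed to `cargo install -j "$(compute_jobs)" ...` or exported as a default.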

Some files were not shown because too many files have changed in this diff.