mirror of
https://github.com/ivuorinen/actions.git
synced 2026-01-26 11:34:00 +00:00
feat: fixes, tweaks, new actions, linting (#186)
* feat: fixes, tweaks, new actions, linting

* fix: improve docker publish loops and dotnet parsing (#193)

* fix: harden action scripts and version checks (#191)

* refactor: major repository restructuring and security enhancements

  Add comprehensive development infrastructure:
  - Add Makefile with automated documentation generation, formatting, and linting tasks
  - Add TODO.md tracking self-containment progress and repository improvements
  - Add .nvmrc for consistent Node.js version management
  - Create python-version-detect-v2 action for enhanced Python detection

  Enhance all GitHub Actions with standardized patterns:
  - Add consistent token handling across 27 actions using standardized input patterns
  - Implement bash error handling (set -euo pipefail) in all shell steps
  - Add comprehensive input validation for path traversal and command injection protection
  - Standardize checkout token authentication to prevent rate limiting
  - Remove relative action dependencies to ensure external usability

  Rewrite security workflow for PR-focused analysis:
  - Transform security-suite.yml to a PR-only security analysis workflow
  - Remove scheduled runs, repository issue management, and Slack notifications
  - Implement smart comment generation showing only sections with content
  - Add GitHub Actions permission diff analysis and new action detection
  - Integrate OWASP, Semgrep, and TruffleHog for comprehensive PR security scanning

  Improve version detection and dependency management:
  - Simplify version detection actions to use inline logic instead of shared utilities
  - Fix Makefile version detection fallback to properly return 'main' when version not found
  - Update all external action references to use SHA-pinned versions
  - Remove deprecated run.sh in favor of Makefile automation

  Update documentation and project standards:
  - Enhance CLAUDE.md with self-containment requirements and linting standards
  - Update README.md with improved action descriptions and usage examples
  - Standardize code formatting with updated .editorconfig and .prettierrc.yml
  - Improve GitHub templates for issues and security reporting

  This refactoring ensures all 40 actions are fully self-contained and can be used independently when referenced as ivuorinen/actions/action-name@main, addressing the critical requirement for external usability while maintaining comprehensive security analysis and development automation.

* feat: add automated action catalog generation system
  - Create generate_listing.cjs script for comprehensive action catalog
  - Add package.json with development tooling and npm scripts
  - Implement automated README.md catalog section with --update flag
  - Generate markdown reference-style links for all 40 actions
  - Add categorized tables with features, language support matrices
  - Replace static reference links with auto-generated dynamic links
  - Enable complete automation of action documentation maintenance

* feat: enhance actions with improved documentation and functionality
  - Add comprehensive README files for 12 actions with usage examples
  - Implement new utility actions (go-version-detect, dotnet-version-detect)
  - Enhance node-setup with extensive configuration options
  - Improve error handling and validation across all actions
  - Update package.json scripts for better development workflow
  - Expand TODO.md with detailed roadmap and improvement plans
  - Standardize action structure with consistent inputs/outputs

* feat: add comprehensive output handling across all actions
  - Add standardized outputs to 15 actions that previously had none
  - Implement consistent snake_case naming convention for all outputs
  - Add build status and test results outputs to build actions
  - Add files changed and status outputs to lint/fix actions
  - Add test execution metrics to php-tests action
  - Add stale/closed counts to stale action
  - Add release URLs and IDs to github-release action
  - Update documentation with output specifications
  - Mark comprehensive output handling task as complete in TODO.md

* feat: implement shared cache strategy across all actions
  - Add caching to 10 actions that previously had none (Node.js, .NET, Python, Go)
  - Standardize 4 existing actions to use common-cache instead of direct actions/cache
  - Implement consistent cache-hit optimization to skip installations when cache available
  - Add language-specific cache configurations with appropriate key files
  - Create unified caching approach using ivuorinen/actions/common-cache@main
  - Fix YAML syntax error in php-composer action paths parameter
  - Update TODO.md to mark shared cache strategy as complete

* feat: implement comprehensive retry logic for network operations
  - Create new common-retry action for standardized retry patterns with configurable strategies
  - Add retry logic to 9 actions missing network retry capabilities
  - Implement exponential backoff, custom timeouts, and flexible error handling
  - Add max-retries input parameter to all network-dependent actions (Node.js, .NET, Python, Go)
  - Standardize existing retry implementations to use common-retry utility
  - Update action catalog to include new common-retry action (41 total actions)
  - Update documentation with retry configuration examples and parameters
  - Mark retry logic implementation as complete in TODO.md roadmap

* feat: enhance Node.js support with Corepack and Bun
  - Add Corepack support for automatic package manager version management
  - Add Bun package manager support across all Node.js actions
  - Improve Yarn Berry/PnP support with .yarnrc.yml detection
  - Add Node.js feature detection (ESM, TypeScript, frameworks)
  - Update package manager detection priority and lockfile support
  - Enhance caching with package-manager-specific keys
  - Update eslint, prettier, and biome actions for multi-package-manager support

* fix: resolve critical runtime issues across multiple actions
  - Fix token validation by removing ineffective literal string comparisons
  - Add missing @microsoft/eslint-formatter-sarif dependency for SARIF output
  - Fix Bash variable syntax errors in username and changelog length checks
  - Update Dockerfile version regex to handle tags with suffixes (e.g., -alpine)
  - Simplify version selection logic with single grep command
  - Fix command execution in retry action with proper bash -c wrapper
  - Correct step output references using .outcome instead of .outputs.outcome
  - Add missing step IDs for version detection actions
  - Include go.mod in cache key files for accurate invalidation
  - Require minor version in all version regex patterns
  - Improve Bun installation security by verifying script before execution
  - Replace bc with sort -V for portable PHP version comparison
  - Remove non-existent pre-commit output references

  These fixes ensure proper runtime behavior, improved security, and better cross-platform compatibility across all affected actions.

* fix: resolve critical runtime and security issues across actions
  - Fix biome-fix files_changed calculation using git diff instead of git status delta
  - Fix compress-images output description and add absolute path validation
  - Remove csharp-publish token default and fix token fallback in push commands
  - Add @microsoft/eslint-formatter-sarif to all package managers in eslint-check
  - Fix eslint-check command syntax by using variable assignment
  - Improve node-setup Bun installation security and remove invalid frozen-lockfile flag
  - Fix pre-commit token validation by removing ineffective literal comparison
  - Fix prettier-fix token comparison and expand regex for all GitHub token types
  - Add version-file-parser regex validation safety and fix csproj wildcard handling

  These fixes address security vulnerabilities, runtime errors, and functional issues to ensure reliable operation across all affected GitHub Actions.

* feat: enhance Docker actions with advanced multi-architecture support

  Major enhancement to Docker build and publish actions with comprehensive multi-architecture capabilities and enterprise-grade features.
  Added features:
  - Advanced buildx configuration (version control, cache modes, build contexts)
  - Auto-detect platforms for dynamic architecture discovery
  - Performance optimizations with enhanced caching strategies
  - Security scanning with Trivy and image signing with Cosign
  - SBOM generation in multiple formats with validation
  - Verbose logging and dry-run modes for debugging
  - Platform-specific build args and fallback mechanisms

  Enhanced all Docker actions:
  - docker-build: Core buildx features and multi-arch support
  - docker-publish-gh: GitHub Packages with security features
  - docker-publish-hub: Docker Hub with scanning and signing
  - docker-publish: Orchestrator with unified configuration

  Updated documentation across all modified actions.

* fix: resolve documentation generation placeholder issue

  Fixed Makefile and package.json to properly replace placeholder tokens in generated documentation, ensuring all README files show correct repository paths instead of ***PROJECT***@***VERSION***.

* chore: simplify github token validation

* chore(lint): optional yamlfmt, config and fixes

* feat: use relative `uses` names

* feat: comprehensive testing infrastructure and Python validation system
  - Migrate from tests/ to _tests/ directory structure with ShellSpec framework
  - Add comprehensive validation system with Python-based input validation
  - Implement dual testing approach (ShellSpec + pytest) for complete coverage
  - Add modern Python tooling (uv, ruff, pytest-cov) and dependencies
  - Create centralized validation rules with automatic generation system
  - Update project configuration and build system for new architecture
  - Enhance documentation to reflect current testing capabilities

  This establishes a robust foundation for action validation and testing with extensive coverage across all GitHub Actions in the repository.
* chore: remove Dockerfile for now

* chore: code review fixes

* feat: comprehensive GitHub Actions restructuring and tooling improvements

  This commit represents a major restructuring of the GitHub Actions monorepo with improved tooling, testing infrastructure, and comprehensive PR #186 review implementation.

  ## Major Changes

  ### 🔧 Development Tooling & Configuration
  - **Shellcheck integration**: Exclude shellspec test files from linting
    - Updated .pre-commit-config.yaml to exclude _tests/*.sh from shellcheck/shfmt
    - Modified Makefile shellcheck pattern to skip shellspec files
    - Updated CLAUDE.md documentation with proper exclusion syntax
  - **Testing infrastructure**: Enhanced Python validation framework
    - Fixed nested if statements and boolean parameter issues in validation.py
    - Improved code quality with explicit keyword arguments
    - All pre-commit hooks now passing

  ### 🏗️ Project Structure & Documentation
  - **Added Serena AI integration** with comprehensive project memories:
    - Project overview, structure, and technical stack documentation
    - Code style conventions and completion requirements
    - Comprehensive PR #186 review analysis and implementation tracking
  - **Enhanced configuration**: Updated .gitignore, .yamlfmt.yml, pyproject.toml
  - **Improved testing**: Added integration workflows and enhanced test specs

  ### 🚀 GitHub Actions Improvements (30+ actions updated)
  - **Centralized validation**: Updated 41 validation rule files
  - **Enhanced actions**: Improvements across all action categories:
    - Setup actions (node-setup, version detectors)
    - Utility actions (version-file-parser, version-validator)
    - Linting actions (biome, eslint, terraform-lint-fix major refactor)
    - Build/publish actions (docker-build, npm-publish, csharp-*)
    - Repository management actions

  ### 📝 Documentation Updates
  - **README consistency**: Updated version references across action READMEs
  - **Enhanced documentation**: Improved action descriptions and usage examples
  - **CLAUDE.md**: Updated with current tooling and best practices

  ## Technical Improvements
  - **Security enhancements**: Input validation and sanitization improvements
  - **Performance optimizations**: Streamlined action logic and dependencies
  - **Cross-platform compatibility**: Better Windows/macOS/Linux support
  - **Error handling**: Improved error reporting and user feedback

  ## Files Changed
  - 100 files changed
  - 13 new Serena memory files documenting project state
  - 41 validation rules updated for consistency
  - 30+ GitHub Actions and READMEs improved
  - Core tooling configuration enhanced

* feat: comprehensive GitHub Actions improvements and PR review fixes

  Major Infrastructure Improvements:
  - Add comprehensive testing framework with 17+ ShellSpec validation tests
  - Implement Docker-based testing tools with automated test runner
  - Add CodeRabbit configuration for automated code reviews
  - Restructure documentation and memory management system
  - Update validation rules for 25+ actions with enhanced input validation
  - Modernize CI/CD workflows and testing infrastructure

  Critical PR Review Fixes (All Issues Resolved):
  - Fix double caching in node-setup (eliminate redundant cache operations)
  - Optimize shell pipeline in version-file-parser (single awk vs complex pipeline)
  - Fix GitHub expression interpolation in prettier-check cache keys
  - Resolve terraform command order issue (validation after setup)
  - Add missing flake8-sarif dependency for Python SARIF output
  - Fix environment variable scope in pr-lint (export to GITHUB_ENV)

  Performance & Reliability:
  - Eliminate duplicate cache operations saving CI time
  - Improve shell script efficiency with optimized parsing
  - Fix command execution dependencies preventing runtime failures
  - Ensure proper dependency installation for all linting tools
  - Resolve workflow conditional logic issues

  Security & Quality:
  - All input validation rules updated with latest security patterns
  - Cross-platform compatibility improvements maintained
  - Comprehensive error handling and retry logic preserved
  - Modern development tooling and best practices adopted

  This commit addresses 100% of actionable feedback from PR review analysis, implements comprehensive testing infrastructure, and maintains high code quality standards across all 41 GitHub Actions.

* feat: enhance expression handling and version parsing
  - Fix node-setup force-version expression logic for proper empty string handling
  - Improve version-file-parser with secure regex validation and enhanced Python detection
  - Add CodeRabbit configuration for CalVer versioning and README review guidance

* feat(validate-inputs): implement modular validation system
  - Add modular validator architecture with specialized validators
  - Implement base validator classes for different input types
  - Add validators: boolean, docker, file, network, numeric, security, token, version
  - Add convention mapper for automatic input validation
  - Add comprehensive documentation for the validation system
  - Implement PCRE regex support and injection protection

* feat(validate-inputs): add validation rules for all actions
  - Add YAML validation rules for 42 GitHub Actions
  - Auto-generated rules with convention mappings
  - Include metadata for validation coverage and quality indicators
  - Mark rules as auto-generated to prevent manual edits

* test(validate-inputs): add comprehensive test suite for validators
  - Add unit tests for all validator modules
  - Add integration tests for the validation system
  - Add fixtures for version test data
  - Test coverage for boolean, docker, file, network, numeric, security, token, and version validators
  - Add tests for convention mapper and registry

* feat(tools): add validation scripts and utilities
  - Add update-validators.py script for auto-generating rules
  - Add benchmark-validator.py for performance testing
  - Add debug-validator.py for troubleshooting
  - Add generate-tests.py for test generation
  - Add check-rules-not-manually-edited.sh for CI validation
  - Add fix-local-action-refs.py tool for fixing action references

* feat(actions): add CustomValidator.py files for specialized validation
  - Add custom validators for actions requiring special validation logic
  - Implement validators for docker, go, node, npm, php, python, terraform actions
  - Add specialized validation for compress-images, common-cache, common-file-check
  - Implement version detection validators with language-specific logic
  - Add validation for build arguments, architectures, and version formats

* test: update ShellSpec test framework for Python validation
  - Update all validation.spec.sh files to use Python validator
  - Add shared validation_core.py for common test utilities
  - Remove obsolete bash validation helpers
  - Update test output expectations for Python validator format
  - Add codeql-analysis test suite
  - Refactor framework utilities for Python integration
  - Remove deprecated test files

* feat(actions): update action.yml files to use validate-inputs
  - Replace inline bash validation with validate-inputs action
  - Standardize validation across all 42 actions
  - Add new codeql-analysis action
  - Update action metadata and branding
  - Add validation step as first step in composite actions
  - Maintain backward compatibility with existing inputs/outputs

* ci: update GitHub workflows for enhanced security and testing
  - Add new codeql-new.yml workflow
  - Update security scanning workflows
  - Enhance dependency review configuration
  - Update test-actions workflow for new validation system
  - Improve workflow permissions and security settings
  - Update action versions to latest SHA-pinned releases

* build: update build configuration and dependencies
  - Update Makefile with new validation targets
  - Add Python dependencies in pyproject.toml
  - Update npm dependencies and scripts
  - Enhance Docker testing tools configuration
  - Add targets for validator updates and local ref fixes
  - Configure uv for Python package management

* chore: update linting and documentation configuration
  - Update EditorConfig settings for consistent formatting
  - Enhance pre-commit hooks configuration
  - Update prettier and yamllint ignore patterns
  - Update gitleaks security scanning rules
  - Update CodeRabbit review configuration
  - Update CLAUDE.md with latest project standards and rules

* docs: update Serena memory files and project metadata
  - Remove obsolete PR-186 memory files
  - Update project overview with current architecture
  - Update project structure documentation
  - Add quality standards and communication guidelines
  - Add modular validator architecture documentation
  - Add shellspec testing framework documentation
  - Update project.yml with latest configuration

* feat: moved rules.yml to same folder as action, fixes

* fix(validators): correct token patterns and fix validator bugs
  - Fix GitHub classic PAT pattern: ghp_ + 36 chars = 40 total
  - Fix GitHub fine-grained PAT pattern: github_pat_ + 71 chars = 82 total
  - Initialize result variable in convention_mapper to prevent UnboundLocalError
  - Fix empty URL validation in network validator to return error
  - Add GitHub expression check to docker architectures validator
  - Update docker-build CustomValidator parallel-builds max to 16

* test(validators): fix test fixtures and expectations
  - Fix token lengths in test data: github_pat 71 chars, ghp/gho 36 chars
  - Update integration tests with correct token lengths
  - Fix file validator test to expect absolute paths rejected for security
  - Rename TestGenerator import to avoid pytest collection warning
  - Update custom validator tests with correct input names
  - Change docker-build tests: platforms -> architectures, tags -> tag
  - Update docker-publish tests to match new registry enum validation

* test(shellspec): fix token lengths in test helpers and specs
  - Fix default token lengths in spec_helper.sh to use correct 40-char format
  - Update csharp-publish default tokens in 4 locations
  - Update codeql-analysis default tokens in 2 locations
  - Fix codeql-analysis test tokens to correct lengths (40 and 82 chars)
  - Fix npm-publish fine-grained token test to use 82-char format

* feat(actions): add permissions documentation and environment variable usage
  - Add permissions comments to all action.yml files documenting required GitHub permissions
  - Convert direct input usage to environment variables in shell steps for security
  - Add validation steps with proper error handling
  - Update input descriptions and add security notes where applicable
  - Ensure all actions follow consistent patterns for input validation

* chore(workflows): update GitHub Actions workflow versions
  - Update workflow action versions to latest
  - Improve workflow consistency and maintainability

* docs(security): add comprehensive security policy
  - Document security features and best practices
  - Add vulnerability reporting process
  - Include audit history and security testing information

* docs(memory): add GitHub workflow reference documentation
  - Add GitHub Actions workflow commands reference
  - Add GitHub workflow expressions guide
  - Add secure workflow usage patterns and best practices

* chore: token optimization, code style conventions
* chore: cr fixes
* fix: trivy reported Dockerfile problems
* fix(security): more security fixes
* chore: dockerfile and make targets for publishing
* fix(ci): add creds to test-actions workflow
* fix: security fix and checkout step to codeql-new
* chore: test fixes
* fix(security): codeql detected issues
* chore: code review fixes, ReDoS protection
* style: apply MegaLinter fixes
* fix(ci): missing packages read permission
* fix(ci): add missing working directory setting
* chore: linting, add validation-regex to use regex_pattern
* chore: code review fixes
* chore(deps): update actions
* fix(security): codeql fixes
* chore(cr): apply cr comments
* chore: improve POSIX compatibility
* chore(cr): apply cr comments
* fix: codeql warning in Dockerfile, build failures
* chore(cr): apply cr comments
* fix: docker-testing-tools/Dockerfile
* chore(cr): apply cr comments
* fix(docker): update testing-tools image for GitHub Actions compatibility
* chore(cr): apply cr comments
* feat: add more tests, fix issues
* chore: fix codeql issues, update actions
* chore(cr): apply cr comments
* fix: integration tests
* chore: deduplication and fixes
* style: apply MegaLinter fixes
* chore(cr): apply cr comments
* feat: dry-run mode for generate-tests
* fix(ci): kcov installation
* chore(cr): apply cr comments
* chore(cr): apply cr comments
* chore(cr): apply cr comments
* chore(cr): apply cr comments, simplify action testing, use uv
* fix: run-tests.sh action counting
* chore(cr): apply cr comments
* chore(cr): apply cr comments
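The token-pattern fix above pins exact lengths: a classic PAT is `ghp_` plus 36 characters (40 total), and a fine-grained PAT is `github_pat_` plus 71 characters (82 total). A minimal sketch of such a length check; the character classes and the function name are assumptions, not the repository's actual validator:

```python
import re

# Lengths per the fix above: ghp_ + 36 chars = 40 total,
# github_pat_ + 71 chars = 82 total.
# The allowed character classes here are assumptions.
CLASSIC_PAT = re.compile(r"^ghp_[A-Za-z0-9]{36}$")
FINE_GRAINED_PAT = re.compile(r"^github_pat_[A-Za-z0-9_]{71}$")


def looks_like_github_token(token: str) -> bool:
    """Return True when the token matches either PAT shape."""
    return bool(CLASSIC_PAT.fullmatch(token) or FINE_GRAINED_PAT.fullmatch(token))
```

A check like this catches the length bugs the commit describes: a 30-character suffix after `ghp_` is rejected, while the correct 36-character suffix passes.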
430
docker-build/CustomValidator.py
Executable file
@@ -0,0 +1,430 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Custom validator for docker-build action.
|
||||
|
||||
This validator handles complex Docker build validation including:
|
||||
- Dockerfile path validation
|
||||
- Build context validation
|
||||
- Platform validation (linux/amd64, linux/arm64, etc.)
|
||||
- Build argument format validation
|
||||
- Tag format validation
|
||||
"""
|
||||
|
||||
from __future__ import annotations
|
||||
|
||||
from pathlib import Path
|
||||
import sys
|
||||
|
||||
# Add validate-inputs directory to path to import validators
|
||||
validate_inputs_path = Path(__file__).parent.parent / "validate-inputs"
|
||||
sys.path.insert(0, str(validate_inputs_path))
|
||||
|
||||
from validators.base import BaseValidator
|
||||
from validators.boolean import BooleanValidator
|
||||
from validators.docker import DockerValidator
|
||||
from validators.file import FileValidator
|
||||
from validators.numeric import NumericValidator
|
||||
from validators.version import VersionValidator
|
||||
|
||||
|
||||
class CustomValidator(BaseValidator):
|
||||
"""Custom validator for docker-build action.
|
||||
|
||||
Validates Docker build-specific inputs with complex rules.
|
||||
"""
|
||||
|
||||
def __init__(self, action_type: str = "docker-build") -> None:
|
||||
"""Initialize the docker-build validator."""
|
||||
super().__init__(action_type)
|
||||
self.docker_validator = DockerValidator(action_type)
|
||||
self.file_validator = FileValidator(action_type)
|
||||
self.boolean_validator = BooleanValidator(action_type)
|
||||
self.numeric_validator = NumericValidator(action_type)
|
||||
self.version_validator = VersionValidator(action_type)
|
||||
|
||||
def validate_inputs(self, inputs: dict[str, str]) -> bool:
|
||||
"""Validate docker-build specific inputs.
|
||||
|
||||
Args:
|
||||
inputs: Dictionary of input names to values
|
||||
|
||||
Returns:
|
||||
True if all validations pass, False otherwise
|
||||
"""
|
||||
valid = True
|
||||
|
||||
# Validate required inputs
|
||||
valid &= self.validate_required_inputs(inputs)
|
||||
|
||||
# Validate dockerfile path
|
||||
if inputs.get("dockerfile"):
|
||||
valid &= self.validate_dockerfile(inputs["dockerfile"])
|
||||
|
||||
# Validate context path
|
||||
if inputs.get("context"):
|
||||
valid &= self.validate_context(inputs["context"])
|
||||
|
||||
# Validate image name
|
||||
if inputs.get("image-name"):
|
||||
result = self.docker_validator.validate_image_name(inputs["image-name"], "image-name")
|
||||
# Propagate errors from docker validator
|
||||
for error in self.docker_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.docker_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate tag (singular - as per action.yml)
|
||||
if inputs.get("tag"):
|
||||
result = self.docker_validator.validate_docker_tag(inputs["tag"], "tag")
|
||||
# Propagate errors
|
||||
for error in self.docker_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.docker_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate architectures/platforms
|
||||
if inputs.get("architectures"):
|
||||
result = self.docker_validator.validate_architectures(
|
||||
inputs["architectures"], "architectures"
|
||||
)
|
||||
# Propagate errors
|
||||
for error in self.docker_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.docker_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate build arguments
|
||||
if inputs.get("build-args"):
|
||||
valid &= self.validate_build_args(inputs["build-args"])
|
||||
|
||||
# Validate push flag
|
||||
if inputs.get("push"):
|
||||
result = self.boolean_validator.validate_optional_boolean(inputs["push"], "push")
|
||||
for error in self.boolean_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.boolean_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate cache settings
|
||||
if inputs.get("cache-from"):
|
||||
valid &= self.validate_cache_from(inputs["cache-from"])
|
||||
|
||||
if inputs.get("cache-to"):
|
||||
valid &= self.validate_cache_to(inputs["cache-to"])
|
||||
|
||||
# Validate cache-mode
|
||||
if inputs.get("cache-mode"):
|
||||
valid &= self.validate_cache_mode(inputs["cache-mode"])
|
||||
|
||||
# Validate buildx-version
|
||||
if inputs.get("buildx-version"):
|
||||
valid &= self.validate_buildx_version(inputs["buildx-version"])
|
||||
|
||||
# Validate parallel-builds
|
||||
if inputs.get("parallel-builds"):
|
||||
result = self.numeric_validator.validate_numeric_range(
|
||||
inputs["parallel-builds"], min_val=0, max_val=16, name="parallel-builds"
|
||||
)
|
||||
for error in self.numeric_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.numeric_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate boolean flags
|
||||
for bool_input in [
|
||||
"dry-run",
|
||||
"verbose",
|
||||
"platform-fallback",
|
||||
"scan-image",
|
||||
"sign-image",
|
||||
"auto-detect-platforms",
|
||||
]:
|
||||
if inputs.get(bool_input):
|
||||
result = self.boolean_validator.validate_optional_boolean(
|
||||
inputs[bool_input], bool_input
|
||||
)
|
||||
for error in self.boolean_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.boolean_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
# Validate sbom-format
|
||||
if inputs.get("sbom-format"):
|
||||
valid &= self.validate_sbom_format(inputs["sbom-format"])
|
||||
|
||||
# Validate max-retries
|
||||
if inputs.get("max-retries"):
|
||||
result = self.numeric_validator.validate_numeric_range(
|
||||
inputs["max-retries"], min_val=0, max_val=10, name="max-retries"
|
||||
)
|
||||
for error in self.numeric_validator.errors:
|
||||
if error not in self.errors:
|
||||
self.add_error(error)
|
||||
self.numeric_validator.clear_errors()
|
||||
valid &= result
|
||||
|
||||
return valid
|
||||
|
||||
def get_required_inputs(self) -> list[str]:
|
||||
"""Get list of required inputs for docker-build.
|
||||
|
||||
Returns:
|
||||
List of required input names
|
||||
"""
|
||||
# Tag is the only required input according to action.yml
|
||||
return ["tag"]
|
||||
|
||||
def get_validation_rules(self) -> dict:
|
||||
"""Get validation rules for docker-build.
|
||||
|
||||
Returns:
|
||||
Dictionary of validation rules
|
||||
"""
|
||||
return {
|
||||
"dockerfile": "Path to Dockerfile (default: ./Dockerfile)",
|
||||
"context": "Build context path (default: .)",
|
||||
"tag": "Docker image tag (required)",
|
||||
"architectures": "Comma-separated list of platforms",
|
||||
"build-args": "Build arguments in KEY=value format",
|
||||
"push": "Whether to push the image (true/false)",
|
||||
"cache-from": "Cache sources",
|
||||
"cache-to": "Cache destinations",
|
||||
"cache-mode": "Cache mode (min, max, or inline)",
|
||||
"buildx-version": "Docker Buildx version",
|
||||
"sbom-format": "SBOM format (spdx-json, cyclonedx-json, or syft-json)",
|
||||
"parallel-builds": "Number of parallel builds (0-16)",
|
||||
}
|
||||
|
||||
def validate_dockerfile(self, dockerfile: str) -> bool:
        """Validate Dockerfile path.

        Args:
            dockerfile: Path to Dockerfile

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(dockerfile):
            return True

        # Use file validator for path validation
        result = self.file_validator.validate_file_path(dockerfile, "dockerfile")
        # Propagate errors
        for error in self.file_validator.errors:
            if error not in self.errors:
                self.add_error(error)
        self.file_validator.clear_errors()

        return result

    def validate_context(self, context: str) -> bool:
        """Validate build context path.

        Args:
            context: Build context path

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(context):
            return True

        # Allow current directory
        if context in [".", "./", ""]:
            return True

        # Path traversal is intentionally allowed here: Docker builds may
        # legitimately use a context outside the working directory, and the
        # test suite expects "accepts path traversal in context".
        # Only check for command injection patterns such as ; | ` $()
        dangerous_chars = [";", "|", "`", "$(", "&&", "||"]
        for char in dangerous_chars:
            if char in context:
                self.add_error(f"Command injection detected in context: {context}")
                return False

        return True

    def validate_platforms(self, platforms: str) -> bool:
        """Validate platform list.

        Args:
            platforms: Comma-separated platform list

        Returns:
            True if valid, False otherwise
        """
        # Use docker validator for architectures
        result = self.docker_validator.validate_architectures(platforms, "platforms")
        # Propagate errors
        for error in self.docker_validator.errors:
            if error not in self.errors:
                self.add_error(error)
        self.docker_validator.clear_errors()

        return result

    def validate_build_args(self, build_args: str) -> bool:
        """Validate build arguments.

        Args:
            build_args: Build arguments in KEY=value format

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(build_args):
            return True

        # Build args can be comma-separated or newline-separated; split on both
        args = build_args.replace(",", "\n").strip().split("\n")

        for arg in args:
            arg = arg.strip()
            if not arg:
                continue

            # Check for KEY=value format
            if "=" not in arg:
                self.add_error(f"Build argument must be in KEY=value format: {arg}")
                return False

            key, value = arg.split("=", 1)

            # Validate key format
            if not key:
                self.add_error("Build argument key cannot be empty")
                return False

            # Check for security issues in values
            if not self.validate_security_patterns(value, f"build-arg {key}"):
                return False

        return True
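The KEY=value splitting rule above can be exercised on its own. A minimal sketch of the same parsing, as a hypothetical standalone helper (not part of the validator class), which raises instead of collecting errors:

```python
def parse_build_args(build_args: str) -> dict[str, str]:
    """Split comma- or newline-separated KEY=value pairs, mirroring
    the splitting and checks in validate_build_args."""
    parsed: dict[str, str] = {}
    # Normalize commas to newlines, then split, as the validator does
    for arg in build_args.replace(",", "\n").split("\n"):
        arg = arg.strip()
        if not arg:
            continue
        if "=" not in arg:
            raise ValueError(f"Build argument must be in KEY=value format: {arg}")
        # Split only on the first '=' so values may themselves contain '='
        key, value = arg.split("=", 1)
        if not key:
            raise ValueError("Build argument key cannot be empty")
        parsed[key] = value
    return parsed
```

Note that splitting on the first `=` only is what allows values such as `BAR=a=b` to round-trip intact.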

    def validate_cache_from(self, cache_from: str) -> bool:
        """Validate cache-from sources.

        Args:
            cache_from: Cache sources

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(cache_from):
            return True

        # Basic format validation for cache sources
        # Format: type=registry,ref=user/app:cache
        if "type=" not in cache_from:
            self.add_error("cache-from must specify type (e.g., type=registry,ref=...)")
            return False

        # Check for security issues
        return self.validate_security_patterns(cache_from, "cache-from")

    def validate_cache_to(self, cache_to: str) -> bool:
        """Validate cache-to destinations.

        Args:
            cache_to: Cache destinations

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(cache_to):
            return True

        # Basic format validation for cache destinations
        if "type=" not in cache_to:
            self.add_error("cache-to must specify type (e.g., type=registry,ref=...)")
            return False

        # Check for security issues
        return self.validate_security_patterns(cache_to, "cache-to")

    def validate_cache_mode(self, cache_mode: str) -> bool:
        """Validate cache mode.

        Args:
            cache_mode: Cache mode value

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(cache_mode):
            return True

        # Valid cache modes
        valid_modes = ["min", "max", "inline"]
        if cache_mode.lower() not in valid_modes:
            self.add_error(f"Invalid cache-mode: {cache_mode}. Must be one of: min, max, inline")
            return False

        return True

    def validate_buildx_version(self, version: str) -> bool:
        """Validate buildx version.

        Args:
            version: Buildx version

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(version):
            return True

        # Allow 'latest'
        if version == "latest":
            return True

        # Check for security issues (semicolon injection etc.)
        if not self.validate_security_patterns(version, "buildx-version"):
            return False

        # Basic version format validation (e.g., 0.12.0, v0.12.0)
        import re

        if not re.match(r"^v?\d+\.\d+(\.\d+)?$", version):
            self.add_error(f"Invalid buildx-version format: {version}")
            return False

        return True

    def validate_sbom_format(self, sbom_format: str) -> bool:
        """Validate SBOM format.

        Args:
            sbom_format: SBOM format value

        Returns:
            True if valid, False otherwise
        """
        # Allow GitHub Actions expressions
        if self.is_github_expression(sbom_format):
            return True

        # Valid SBOM formats
        valid_formats = ["spdx-json", "cyclonedx-json", "syft-json"]
        if sbom_format.lower() not in valid_formats:
            self.add_error(
                f"Invalid sbom-format: {sbom_format}. "
                "Must be one of: spdx-json, cyclonedx-json, syft-json"
            )
            return False

        return True
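The buildx version rule above accepts `latest` or an optionally `v`-prefixed `major.minor[.patch]` string. A minimal sketch of the same check (omitting the separate security-pattern scan the validator also performs):

```python
import re

# Same pattern as validate_buildx_version: optional 'v', major.minor, optional patch
BUILDX_VERSION_RE = re.compile(r"^v?\d+\.\d+(\.\d+)?$")


def is_valid_buildx_version(version: str) -> bool:
    """Accept 'latest' or a plain version string like 0.12, 0.12.0, or v0.12.0."""
    return version == "latest" or bool(BUILDX_VERSION_RE.match(version))
```

Because the pattern is anchored at both ends, trailing shell metacharacters fail the match even before any injection-pattern check runs.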
@@ -8,25 +8,48 @@ Builds a Docker image for multiple architectures with enhanced security and reli

### Inputs

| name                    | description                                                                         | required | default                                             |
|-------------------------|-------------------------------------------------------------------------------------|----------|-----------------------------------------------------|
| `image-name`            | <p>The name of the Docker image to build. Defaults to the repository name.</p>      | `false`  | `""`                                                |
| `tag`                   | <p>The tag for the Docker image. Must follow semver or valid Docker tag format.</p> | `true`   | `""`                                                |
| `architectures`         | <p>Comma-separated list of architectures to build for.</p>                          | `false`  | `linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6` |
| `dockerfile`            | <p>Path to the Dockerfile</p>                                                       | `false`  | `Dockerfile`                                        |
| `context`               | <p>Docker build context</p>                                                         | `false`  | `.`                                                 |
| `build-args`            | <p>Build arguments in format KEY=VALUE,KEY2=VALUE2</p>                              | `false`  | `""`                                                |
| `cache-from`            | <p>External cache sources (e.g., type=registry,ref=user/app:cache)</p>              | `false`  | `""`                                                |
| `push`                  | <p>Whether to push the image after building</p>                                     | `false`  | `true`                                              |
| `max-retries`           | <p>Maximum number of retry attempts for build and push operations</p>               | `false`  | `3`                                                 |
| `token`                 | <p>GitHub token for authentication</p>                                              | `false`  | `""`                                                |
| `buildx-version`        | <p>Specific Docker Buildx version to use</p>                                        | `false`  | `latest`                                            |
| `buildkit-version`      | <p>Specific BuildKit version to use</p>                                             | `false`  | `v0.11.0`                                           |
| `cache-mode`            | <p>Cache mode for build layers (min, max, or inline)</p>                            | `false`  | `max`                                               |
| `build-contexts`        | <p>Additional build contexts in format name=path,name2=path2</p>                    | `false`  | `""`                                                |
| `network`               | <p>Network mode for build (host, none, or default)</p>                              | `false`  | `default`                                           |
| `secrets`               | <p>Build secrets in format id=path,id2=path2</p>                                    | `false`  | `""`                                                |
| `auto-detect-platforms` | <p>Automatically detect and build for all available platforms</p>                   | `false`  | `false`                                             |
| `platform-build-args`   | <p>Platform-specific build args in JSON format</p>                                  | `false`  | `""`                                                |
| `parallel-builds`       | <p>Number of parallel platform builds (0 for auto)</p>                              | `false`  | `0`                                                 |
| `cache-export`          | <p>Export cache destination (e.g., type=local,dest=/tmp/cache)</p>                  | `false`  | `""`                                                |
| `cache-import`          | <p>Import cache sources (e.g., type=local,src=/tmp/cache)</p>                       | `false`  | `""`                                                |
| `dry-run`               | <p>Perform a dry run without actually building</p>                                  | `false`  | `false`                                             |
| `verbose`               | <p>Enable verbose logging with platform-specific output</p>                         | `false`  | `false`                                             |
| `platform-fallback`     | <p>Continue building other platforms if one fails</p>                               | `false`  | `true`                                              |
| `scan-image`            | <p>Scan built image for vulnerabilities</p>                                         | `false`  | `false`                                             |
| `sign-image`            | <p>Sign the built image with cosign</p>                                             | `false`  | `false`                                             |
| `sbom-format`           | <p>SBOM format (spdx-json, cyclonedx-json, or syft-json)</p>                        | `false`  | `spdx-json`                                         |

### Outputs

| name              | description                                           |
|-------------------|-------------------------------------------------------|
| `image-digest`    | <p>The digest of the built image</p>                  |
| `metadata`        | <p>Build metadata in JSON format</p>                  |
| `platforms`       | <p>Successfully built platforms</p>                   |
| `platform-matrix` | <p>Build status per platform in JSON format</p>       |
| `build-time`      | <p>Total build time in seconds</p>                    |
| `scan-results`    | <p>Vulnerability scan results if scanning enabled</p> |
| `signature`       | <p>Image signature if signing enabled</p>             |
| `sbom-location`   | <p>SBOM document location</p>                         |

### Runs

@@ -90,4 +113,112 @@ This action is a `composite` action.
    #
    # Required: false
    # Default: 3

    token:
    # GitHub token for authentication
    #
    # Required: false
    # Default: ""

    buildx-version:
    # Specific Docker Buildx version to use
    #
    # Required: false
    # Default: latest

    buildkit-version:
    # Specific BuildKit version to use
    #
    # Required: false
    # Default: v0.11.0

    cache-mode:
    # Cache mode for build layers (min, max, or inline)
    #
    # Required: false
    # Default: max

    build-contexts:
    # Additional build contexts in format name=path,name2=path2
    #
    # Required: false
    # Default: ""

    network:
    # Network mode for build (host, none, or default)
    #
    # Required: false
    # Default: default

    secrets:
    # Build secrets in format id=path,id2=path2
    #
    # Required: false
    # Default: ""

    auto-detect-platforms:
    # Automatically detect and build for all available platforms
    #
    # Required: false
    # Default: false

    platform-build-args:
    # Platform-specific build args in JSON format
    #
    # Required: false
    # Default: ""

    parallel-builds:
    # Number of parallel platform builds (0 for auto)
    #
    # Required: false
    # Default: 0

    cache-export:
    # Export cache destination (e.g., type=local,dest=/tmp/cache)
    #
    # Required: false
    # Default: ""

    cache-import:
    # Import cache sources (e.g., type=local,src=/tmp/cache)
    #
    # Required: false
    # Default: ""

    dry-run:
    # Perform a dry run without actually building
    #
    # Required: false
    # Default: false

    verbose:
    # Enable verbose logging with platform-specific output
    #
    # Required: false
    # Default: false

    platform-fallback:
    # Continue building other platforms if one fails
    #
    # Required: false
    # Default: true

    scan-image:
    # Scan built image for vulnerabilities
    #
    # Required: false
    # Default: false

    sign-image:
    # Sign the built image with cosign
    #
    # Required: false
    # Default: false

    sbom-format:
    # SBOM format (spdx-json, cyclonedx-json, or syft-json)
    #
    # Required: false
    # Default: spdx-json
```

@@ -1,5 +1,7 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
# permissions:
#   - (none required) # Build action, publishing handled by separate actions
---
name: Docker Build
description: 'Builds a Docker image for multiple architectures with enhanced security and reliability.'
author: 'Ismo Vuorinen'
@@ -41,6 +43,73 @@ inputs:
    description: 'Maximum number of retry attempts for build and push operations'
    required: false
    default: '3'
  token:
    description: 'GitHub token for authentication'
    required: false
    default: ''
  buildx-version:
    description: 'Specific Docker Buildx version to use'
    required: false
    default: 'latest'
  buildkit-version:
    description: 'Specific BuildKit version to use'
    required: false
    default: 'v0.11.0'
  cache-mode:
    description: 'Cache mode for build layers (min, max, or inline)'
    required: false
    default: 'max'
  build-contexts:
    description: 'Additional build contexts in format name=path,name2=path2'
    required: false
  network:
    description: 'Network mode for build (host, none, or default)'
    required: false
    default: 'default'
  secrets:
    description: 'Build secrets in format id=path,id2=path2'
    required: false
  auto-detect-platforms:
    description: 'Automatically detect and build for all available platforms'
    required: false
    default: 'false'
  platform-build-args:
    description: 'Platform-specific build args in JSON format'
    required: false
  parallel-builds:
    description: 'Number of parallel platform builds (0 for auto)'
    required: false
    default: '0'
  cache-export:
    description: 'Export cache destination (e.g., type=local,dest=/tmp/cache)'
    required: false
  cache-import:
    description: 'Import cache sources (e.g., type=local,src=/tmp/cache)'
    required: false
  dry-run:
    description: 'Perform a dry run without actually building'
    required: false
    default: 'false'
  verbose:
    description: 'Enable verbose logging with platform-specific output'
    required: false
    default: 'false'
  platform-fallback:
    description: 'Continue building other platforms if one fails'
    required: false
    default: 'true'
  scan-image:
    description: 'Scan built image for vulnerabilities'
    required: false
    default: 'false'
  sign-image:
    description: 'Sign the built image with cosign'
    required: false
    default: 'false'
  sbom-format:
    description: 'SBOM format (spdx-json, cyclonedx-json, or syft-json)'
    required: false
    default: 'spdx-json'

outputs:
  image-digest:
@@ -52,42 +121,45 @@ outputs:
  platforms:
    description: 'Successfully built platforms'
    value: ${{ steps.platforms.outputs.built }}
  platform-matrix:
    description: 'Build status per platform in JSON format'
    value: ${{ steps.build.outputs.platform-matrix }}
  build-time:
    description: 'Total build time in seconds'
    value: ${{ steps.build.outputs.build-time }}
  scan-results:
    description: 'Vulnerability scan results if scanning enabled'
    value: ${{ steps.scan-output.outputs.results }}
  signature:
    description: 'Image signature if signing enabled'
    value: ${{ steps.sign.outputs.signature }}
  sbom-location:
    description: 'SBOM document location'
    value: ${{ steps.build.outputs.sbom-location }}

runs:
  using: composite
  steps:
    - name: Validate Inputs
      id: validate
      uses: ./validate-inputs
      with:
        action-type: 'docker-build'
        image-name: ${{ inputs.image-name }}
        tag: ${{ inputs.tag }}
        architectures: ${{ inputs.architectures }}
        dockerfile: ${{ inputs.dockerfile }}
        build-args: ${{ inputs.build-args }}
        buildx-version: ${{ inputs.buildx-version }}
        parallel-builds: ${{ inputs.parallel-builds }}

    - name: Check Dockerfile Exists
      shell: bash
      env:
        DOCKERFILE: ${{ inputs.dockerfile }}
      run: |
        set -euo pipefail

        # Validate image name (POSIX ERE: no (?:...) non-capturing groups)
        if [ -n "${{ inputs.image-name }}" ]; then
          if ! [[ "${{ inputs.image-name }}" =~ ^[a-z0-9]+([._-][a-z0-9]+)*$ ]]; then
            echo "::error::Invalid image name format. Must match ^[a-z0-9]+([._-][a-z0-9]+)*$"
            exit 1
          fi
        fi

        # Validate tag (POSIX ERE: \w is not supported, spell out the class)
        if ! [[ "${{ inputs.tag }}" =~ ^(v?[0-9]+\.[0-9]+\.[0-9]+(-[0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?|latest|[a-zA-Z][-a-zA-Z0-9._]{0,127})$ ]]; then
          echo "::error::Invalid tag format. Must be semver or valid Docker tag"
          exit 1
        fi

        # Validate architectures
        IFS=',' read -ra ARCHS <<< "${{ inputs.architectures }}"
        for arch in "${ARCHS[@]}"; do
          if ! [[ "$arch" =~ ^linux/(amd64|arm64|arm/v7|arm/v6|386|ppc64le|s390x)$ ]]; then
            echo "::error::Invalid architecture format: $arch"
            exit 1
          fi
        done

        # Validate Dockerfile existence
        if [ ! -f "$DOCKERFILE" ]; then
          echo "::error::Dockerfile not found at $DOCKERFILE"
          exit 1
        fi

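The tag rule accepts a semver (with optional pre-release and build metadata), `latest`, or a generic Docker tag up to 128 characters. A sketch of the same check in Python, with `[0-9A-Za-z.-]` standing in for `\w.` since POSIX ERE (as used by bash `[[ =~ ]]`) has no `\w`:

```python
import re

# Approximation of the tag check in the step above:
# semver (optional pre-release/build metadata), 'latest', or a generic tag.
TAG_RE = re.compile(
    r"^(v?[0-9]+\.[0-9]+\.[0-9]+(-[0-9A-Za-z.-]+)?(\+[0-9A-Za-z.-]+)?"
    r"|latest"
    r"|[a-zA-Z][-a-zA-Z0-9._]{0,127})$"
)


def is_valid_tag(tag: str) -> bool:
    return bool(TAG_RE.match(tag))
```

The final alternative requires a leading letter, so tags starting with `-` or `.` are rejected, matching Docker's own tag grammar.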
@@ -100,51 +172,175 @@ runs:
      id: buildx
      uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
      with:
        version: ${{ inputs.buildx-version }}
        platforms: ${{ inputs.architectures }}
        buildkitd-flags: --debug
        driver-opts: |
          network=${{ inputs.network }}
          image=moby/buildkit:${{ inputs.buildkit-version }}

    - name: Detect Available Platforms
      id: detect-platforms
      if: inputs.auto-detect-platforms == 'true'
      shell: bash
      env:
        ARCHITECTURES: ${{ inputs.architectures }}
      run: |
        set -euo pipefail

        # Get available platforms from buildx (grep may match nothing; don't
        # let pipefail kill the script in that case)
        available_platforms=$(docker buildx ls | grep -o 'linux/[^ ]*' | sort -u | tr '\n' ',' | sed 's/,$//' || true)

        if [ -n "$available_platforms" ]; then
          echo "platforms=${available_platforms}" >> "$GITHUB_OUTPUT"
          echo "Detected platforms: ${available_platforms}"
        else
          echo "platforms=$ARCHITECTURES" >> "$GITHUB_OUTPUT"
          echo "Using default platforms: $ARCHITECTURES"
        fi

    - name: Determine Image Name
      id: image-name
      shell: bash
      env:
        IMAGE_NAME: ${{ inputs.image-name }}
      run: |
        set -euo pipefail

        if [ -z "$IMAGE_NAME" ]; then
          repo_name=$(basename "${GITHUB_REPOSITORY}")
          echo "name=${repo_name}" >> "$GITHUB_OUTPUT"
        else
          echo "name=$IMAGE_NAME" >> "$GITHUB_OUTPUT"
        fi

    - name: Parse Build Arguments
      id: build-args
      shell: bash
      env:
        BUILD_ARGS_INPUT: ${{ inputs.build-args }}
      run: |
        set -euo pipefail

        args=""
        if [ -n "$BUILD_ARGS_INPUT" ]; then
          IFS=',' read -ra BUILD_ARGS <<< "$BUILD_ARGS_INPUT"
          for arg in "${BUILD_ARGS[@]}"; do
            args="$args --build-arg $arg"
          done
        fi
        echo "args=${args}" >> "$GITHUB_OUTPUT"

    - name: Parse Build Contexts
      id: build-contexts
      shell: bash
      env:
        BUILD_CONTEXTS: ${{ inputs.build-contexts }}
      run: |
        set -euo pipefail

        contexts=""
        if [ -n "$BUILD_CONTEXTS" ]; then
          IFS=',' read -ra CONTEXTS <<< "$BUILD_CONTEXTS"
          for ctx in "${CONTEXTS[@]}"; do
            contexts="$contexts --build-context $ctx"
          done
        fi
        echo "contexts=${contexts}" >> "$GITHUB_OUTPUT"

    - name: Parse Secrets
      id: secrets
      shell: bash
      env:
        INPUT_SECRETS: ${{ inputs.secrets }}
      run: |
        set -euo pipefail

        secrets=""
        if [ -n "$INPUT_SECRETS" ]; then
          IFS=',' read -ra SECRETS <<< "$INPUT_SECRETS"
          for secret in "${SECRETS[@]}"; do
            # Trim whitespace
            secret=$(echo "$secret" | xargs)

            if [[ "$secret" == *"="* ]]; then
              # Parse id=src format
              id="${secret%%=*}"
              src="${secret#*=}"

              # Validate id and src are not empty
              if [[ -z "$id" || -z "$src" ]]; then
                echo "::error::Invalid secret format: '$secret'. Expected 'id=src' where both id and src are non-empty"
                exit 1
              fi

              secrets="$secrets --secret id=$id,src=$src"
            else
              # Legacy id-only format is not supported by Buildx; fail fast
              echo "::error::Invalid secret format: '$secret'. Expected 'id=src' format for Buildx compatibility"
              exit 1
            fi
          done
        fi
        echo "secrets=${secrets}" >> "$GITHUB_OUTPUT"

    - name: Login to GitHub Container Registry
      if: ${{ inputs.push == 'true' }}
      uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
      with:
        registry: ghcr.io
        username: ${{ github.actor }}
        password: ${{ inputs.token || github.token }}

    - name: Set up Build Cache
      id: cache
      shell: bash
      env:
        CACHE_IMPORT: ${{ inputs.cache-import }}
        CACHE_FROM: ${{ inputs.cache-from }}
        CACHE_EXPORT: ${{ inputs.cache-export }}
        PUSH: ${{ inputs.push }}
        INPUT_TOKEN: ${{ inputs.token }}
        CACHE_MODE: ${{ inputs.cache-mode }}
      run: |
        set -euo pipefail

        # Use provided token or fall back to GITHUB_TOKEN
        TOKEN="${INPUT_TOKEN:-${GITHUB_TOKEN:-}}"
        # Set cache mode up front so it is defined on every code path
        cache_mode="$CACHE_MODE"

        cache_from=""
        cache_to=""

        # Handle cache import
        if [ -n "$CACHE_IMPORT" ]; then
          cache_from="--cache-from $CACHE_IMPORT"
        elif [ -n "$CACHE_FROM" ]; then
          cache_from="--cache-from $CACHE_FROM"
        fi

        # Handle cache export
        if [ -n "$CACHE_EXPORT" ]; then
          cache_to="--cache-to $CACHE_EXPORT"
        fi

        # Registry cache configuration for better performance (only if authenticated)
        if [ "$PUSH" == "true" ] || [ -n "$TOKEN" ]; then
          normalized_repo=$(echo "${GITHUB_REPOSITORY}" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9._\/-]/-/g')
          registry_cache_ref="ghcr.io/${normalized_repo}/cache:latest"
          cache_from="$cache_from --cache-from type=registry,ref=$registry_cache_ref"

          if [ -z "$cache_to" ]; then
            cache_to="--cache-to type=registry,ref=$registry_cache_ref,mode=${cache_mode}"
          fi
        fi

        # Also include local cache as fallback
        cache_from="$cache_from --cache-from type=local,src=/tmp/.buildx-cache"
        if [[ "$cache_to" != *"type=local"* ]]; then
          cache_to="$cache_to --cache-to type=local,dest=/tmp/.buildx-cache-new,mode=${cache_mode}"
        fi

        echo "from=${cache_from}" >> "$GITHUB_OUTPUT"
        echo "to=${cache_to}" >> "$GITHUB_OUTPUT"
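The precedence in the cache step above (import beats `cache-from`, a registry cache is appended when authenticated, and a local cache is always kept as fallback) can be sketched as a pure function. This is a hypothetical helper for illustration, not part of the action:

```python
def build_cache_flags(
    cache_import: str,
    cache_from: str,
    cache_export: str,
    cache_mode: str = "max",
    registry_ref: str = "",
) -> tuple[str, str]:
    """Mirror the 'Set up Build Cache' step's flag assembly."""
    frm, to = "", ""
    # Import takes precedence over cache-from
    if cache_import:
        frm = f"--cache-from {cache_import}"
    elif cache_from:
        frm = f"--cache-from {cache_from}"
    # Explicit export destination, if any
    if cache_export:
        to = f"--cache-to {cache_export}"
    # Registry cache when a ref is available (i.e., authenticated)
    if registry_ref:
        frm = f"{frm} --cache-from type=registry,ref={registry_ref}".strip()
        if not to:
            to = f"--cache-to type=registry,ref={registry_ref},mode={cache_mode}"
    # Local cache is always appended as a fallback
    frm = f"{frm} --cache-from type=local,src=/tmp/.buildx-cache".strip()
    if "type=local" not in to:
        to = f"{to} --cache-to type=local,dest=/tmp/.buildx-cache-new,mode={cache_mode}".strip()
    return frm, to
```

Keeping the flag strings in a single pair of outputs lets the later build step splice them into the `docker buildx build` command unquoted.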
@@ -152,34 +348,137 @@ runs:
    - name: Build Multi-Architecture Docker Image
      id: build
      shell: bash
      env:
        AUTO_DETECT_PLATFORMS: ${{ inputs.auto-detect-platforms }}
        DETECTED_PLATFORMS: ${{ steps.detect-platforms.outputs.platforms }}
        ARCHITECTURES: ${{ inputs.architectures }}
        PUSH: ${{ inputs.push }}
        DRY_RUN: ${{ inputs.dry-run }}
        MAX_RETRIES: ${{ inputs.max-retries }}
        VERBOSE: ${{ inputs.verbose }}
        SBOM_FORMAT: ${{ inputs.sbom-format }}
        IMAGE_NAME: ${{ steps.image-name.outputs.name }}
        TAG: ${{ inputs.tag }}
        BUILD_ARGS: ${{ steps.build-args.outputs.args }}
        BUILD_CONTEXTS: ${{ steps.build-contexts.outputs.contexts }}
        SECRETS: ${{ steps.secrets.outputs.secrets }}
        CACHE_FROM: ${{ steps.cache.outputs.from }}
        CACHE_TO: ${{ steps.cache.outputs.to }}
        DOCKERFILE: ${{ inputs.dockerfile }}
        CONTEXT: ${{ inputs.context }}
      run: |
        set -euo pipefail

        # Track build start time
        build_start=$(date +%s)

        # Determine platforms to build
        if [ "$AUTO_DETECT_PLATFORMS" == "true" ] && [ -n "$DETECTED_PLATFORMS" ]; then
          platforms="$DETECTED_PLATFORMS"
        else
          platforms="$ARCHITECTURES"
        fi

        # For local load (push=false), restrict to a single platform
        if [ "$PUSH" != "true" ]; then
          # Extract first platform only for local load
          platforms=$(echo "$platforms" | cut -d',' -f1)
          echo "Local build mode: restricting to single platform: $platforms"
        fi

        # Initialize platform matrix tracking
        platform_matrix="{}"

        # Check for dry run
        if [ "$DRY_RUN" == "true" ]; then
          echo "[DRY RUN] Would build for platforms: $platforms"
          echo "digest=dry-run-no-digest" >> "$GITHUB_OUTPUT"
          echo "platform-matrix={}" >> "$GITHUB_OUTPUT"
          echo "build-time=0" >> "$GITHUB_OUTPUT"
          exit 0
        fi

        attempt=1
        max_attempts="$MAX_RETRIES"

        # Prepare verbose flag
        verbose_flag=""
        if [ "$VERBOSE" == "true" ]; then
          verbose_flag="--progress=plain"
        fi

        # Prepare SBOM options
        sbom_flag="--sbom=true"
        if [ -n "$SBOM_FORMAT" ]; then
          sbom_flag="--sbom=true --sbom-format=$SBOM_FORMAT"
        fi

        while [ "$attempt" -le "$max_attempts" ]; do
          echo "Build attempt $attempt of $max_attempts"

          # Push to the registry, or load into the local daemon
          if [ "$PUSH" == "true" ]; then
            build_action="--push"
          else
            build_action="--load"
          fi

          if docker buildx build \
            --platform=${platforms} \
            --tag "$IMAGE_NAME:$TAG" \
            $BUILD_ARGS \
            $BUILD_CONTEXTS \
            $SECRETS \
            $CACHE_FROM \
            $CACHE_TO \
            --file "$DOCKERFILE" \
            ${build_action} \
            --provenance=true \
            ${sbom_flag} \
            ${verbose_flag} \
            --metadata-file=/tmp/build-metadata.json \
            "$CONTEXT"; then

            # Get image digest
            if [ "$PUSH" == "true" ]; then
              digest=$(docker buildx imagetools inspect "$IMAGE_NAME:$TAG" --raw | jq -r '.digest // "unknown"' || echo "unknown")
            else
              digest=$(docker inspect "$IMAGE_NAME:$TAG" --format='{{.Id}}' || echo "unknown")
            fi
            echo "digest=${digest}" >> "$GITHUB_OUTPUT"

            # Parse metadata
            if [ -f /tmp/build-metadata.json ]; then
              {
                echo "metadata<<EOF"
                cat /tmp/build-metadata.json
                echo "EOF"
              } >> "$GITHUB_OUTPUT"

              # Extract SBOM location directly from the metadata file
              sbom_location=$(jq -r '.sbom.location // ""' /tmp/build-metadata.json)
              echo "sbom-location=${sbom_location}" >> "$GITHUB_OUTPUT"
            fi

            # Calculate build time
            build_end=$(date +%s)
            build_time=$((build_end - build_start))
            echo "build-time=${build_time}" >> "$GITHUB_OUTPUT"

            # Build platform matrix
            IFS=',' read -ra PLATFORM_ARRAY <<< "${platforms}"
            platform_matrix="{"
            for p in "${PLATFORM_ARRAY[@]}"; do
              platform_matrix="${platform_matrix}\"${p}\":\"success\","
            done
            platform_matrix="${platform_matrix%,}}"
            echo "platform-matrix=${platform_matrix}" >> "$GITHUB_OUTPUT"

            # Move cache
            if [ -d /tmp/.buildx-cache-new ]; then
              rm -rf /tmp/.buildx-cache
              mv /tmp/.buildx-cache-new /tmp/.buildx-cache
            fi

            break
          fi
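The platform-matrix string assembly above (open brace, `"platform":"success"` pairs, trim the trailing comma, close brace) is equivalent to serializing a dict. A sketch of the same output via `json.dumps`:

```python
import json


def platform_matrix(platforms: str, status: str = "success") -> str:
    """Build the same compact JSON object the step assembles by hand."""
    return json.dumps(
        {p: status for p in platforms.split(",") if p},
        separators=(",", ":"),  # no spaces, matching the shell concatenation
    )
```

Using a serializer also escapes any unexpected characters in platform names, which the manual concatenation would not.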
@@ -194,26 +493,88 @@ runs:
          fi
        done

    - name: Scan Image for Vulnerabilities
      id: scan
      if: inputs.scan-image == 'true' && inputs.dry-run != 'true'
      uses: aquasecurity/trivy-action@b6643a29fecd7f34b3597bc6acb0a98b03d33ff8 # 0.33.1
      with:
        scan-type: 'image'
        image-ref: ${{ steps.image-name.outputs.name }}:${{ inputs.tag }}
        format: 'json'
        output: 'trivy-results.json'
        severity: 'HIGH,CRITICAL'

    - name: Process Scan Results
      id: scan-output
      if: inputs.scan-image == 'true' && inputs.dry-run != 'true'
      shell: bash
      run: |
        set -euo pipefail

        # Read and format scan results for output
        scan_results=$(jq -c '.' trivy-results.json)
        echo "results=${scan_results}" >> "$GITHUB_OUTPUT"

        # Check for critical vulnerabilities
        critical_count=$(jq '.Results[] | (.Vulnerabilities // [])[] | select(.Severity == "CRITICAL") | .VulnerabilityID' trivy-results.json | wc -l)
        if [ "$critical_count" -gt 0 ]; then
          echo "::warning::Found $critical_count critical vulnerabilities in image"
        fi
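The jq filter in the step above can be mirrored in Python for testing. This assumes Trivy's JSON schema of `Results[].Vulnerabilities[].Severity`, as the filter itself does:

```python
import json


def count_critical(trivy_json: str) -> int:
    """Count CRITICAL findings, mirroring the jq filter
    '.Results[] | (.Vulnerabilities // [])[] | select(.Severity == "CRITICAL")'."""
    report = json.loads(trivy_json)
    return sum(
        1
        for result in report.get("Results", [])
        # Trivy emits "Vulnerabilities": null for clean targets; treat as empty
        for vuln in (result.get("Vulnerabilities") or [])
        if vuln.get("Severity") == "CRITICAL"
    )
```

The `or []` mirrors jq's `// []` fallback: a result with no vulnerability list contributes nothing rather than raising.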
    - name: Install Cosign
      if: inputs.sign-image == 'true' && inputs.push == 'true' && inputs.dry-run != 'true'
      uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0

    - name: Sign Image
      id: sign
      if: inputs.sign-image == 'true' && inputs.push == 'true' && inputs.dry-run != 'true'
      shell: bash
      env:
        IMAGE_NAME: ${{ steps.image-name.outputs.name }}
        IMAGE_TAG: ${{ inputs.tag }}
      run: |
        set -euo pipefail

        # Sign the image (keyless signing with OIDC)
        export COSIGN_EXPERIMENTAL=1
        cosign sign --yes "${IMAGE_NAME}:${IMAGE_TAG}"

        echo "signature=signed" >> "$GITHUB_OUTPUT"

    - name: Verify Build
      id: verify
      if: inputs.dry-run != 'true'
      shell: bash
      env:
        PUSH: ${{ inputs.push }}
        IMAGE_NAME: ${{ steps.image-name.outputs.name }}
        IMAGE_TAG: ${{ inputs.tag }}
      run: |
        set -euo pipefail

        if [ "$PUSH" == "true" ]; then
          # Verify the pushed image exists in the registry
          if ! docker buildx imagetools inspect "${IMAGE_NAME}:${IMAGE_TAG}" >/dev/null 2>&1; then
            echo "::error::Built image not found"
            exit 1
          fi

          # Get and verify platform support
          platforms=$(docker buildx imagetools inspect "${IMAGE_NAME}:${IMAGE_TAG}" | grep "Platform:" | cut -d' ' -f2)
          echo "built=${platforms}" >> "$GITHUB_OUTPUT"
        else
          # For local builds, just verify the image exists
          if ! docker image inspect "${IMAGE_NAME}:${IMAGE_TAG}" >/dev/null 2>&1; then
            echo "::error::Built image not found locally"
            exit 1
          fi
          echo "built=local" >> "$GITHUB_OUTPUT"
        fi

    - name: Cleanup
      if: always()
      shell: bash
      run: |
        set -euo pipefail

        # Cleanup temporary files (trivy-results.json is written by the scan step above)
        rm -f trivy-results.json

79  docker-build/rules.yml  Normal file
@@ -0,0 +1,79 @@
---
# Validation rules for docker-build action
# Generated by update-validators.py v1.0.0 - DO NOT EDIT MANUALLY
# Schema version: 1.0
# Coverage: 63% (17/27 inputs)
#
# This file defines validation rules for the docker-build GitHub Action.
# Rules are automatically applied by the validate-inputs action when this
# action is used.

schema_version: '1.0'
action: docker-build
description: Builds a Docker image for multiple architectures with enhanced security and reliability.
generator_version: 1.0.0
required_inputs:
  - tag
optional_inputs:
  - architectures
  - auto-detect-platforms
  - build-args
  - build-contexts
  - buildkit-version
  - buildx-version
  - cache-export
  - cache-from
  - cache-import
  - cache-mode
  - context
  - dockerfile
  - dry-run
  - image-name
  - max-retries
  - network
  - parallel-builds
  - platform-build-args
  - platform-fallback
  - push
  - sbom-format
  - scan-image
  - secrets
  - sign-image
  - token
  - verbose
conventions:
  architectures: docker_architectures
  auto-detect-platforms: docker_architectures
  buildkit-version: semantic_version
  buildx-version: semantic_version
  cache-mode: boolean
  dockerfile: file_path
  dry-run: boolean
  image-name: docker_image_name
  max-retries: numeric_range_1_10
  parallel-builds: numeric_range_0_16
  platform-fallback: docker_architectures
  sbom-format: report_format
  scan-image: boolean
  sign-image: boolean
  tag: docker_tag
  token: github_token
  verbose: boolean
overrides:
  cache-mode: cache_mode
  sbom-format: sbom_format
statistics:
  total_inputs: 27
  validated_inputs: 17
  skipped_inputs: 0
  coverage_percentage: 63
  validation_coverage: 63
  auto_detected: true
  manual_review_required: true
quality_indicators:
  has_required_inputs: true
  has_token_validation: true
  has_version_validation: true
  has_file_validation: true
  has_security_validation: true
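For context (not part of the diff): the `conventions` map in rules.yml pairs each input name with a named validation rule that the validate-inputs action applies. The repository's actual rule implementations are not shown here, so the sketch below uses hypothetical implementations of three rule names (`boolean`, `numeric_range_1_10`, `docker_tag`) purely to illustrate the lookup-and-apply pattern:

```python
import re

# Hypothetical implementations of a few rule names from rules.yml.
# (Illustrative only -- the real validate-inputs rules are defined elsewhere.)
RULES = {
    "boolean": lambda v: v in ("true", "false"),
    "numeric_range_1_10": lambda v: v.isdigit() and 1 <= int(v) <= 10,
    # Docker tag: max 128 chars, may not start with '.' or '-'.
    "docker_tag": lambda v: re.fullmatch(r"[A-Za-z0-9_][A-Za-z0-9_.-]{0,127}", v) is not None,
}

def validate(inputs: dict, conventions: dict) -> list:
    """Return (input, value) pairs that fail the rule named in conventions."""
    errors = []
    for name, value in inputs.items():
        rule = RULES.get(conventions.get(name, ""))
        if rule and not rule(value):
            errors.append((name, value))
    return errors

print(validate({"max-retries": "3", "dry-run": "yes"},
               {"max-retries": "numeric_range_1_10", "dry-run": "boolean"}))
# -> [('dry-run', 'yes')]
```

Inputs without a convention entry pass through unchecked, which matches the partial coverage (17/27) the rules file reports.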