Mirror of https://github.com/ivuorinen/actions.git, synced 2026-01-26 03:23:59 +00:00
feat: add GitHub Actions workflows for code quality and automation (#2)
.editorconfig
@@ -6,11 +6,10 @@ end_of_line = lf
indent_size = 2
indent_style = space
insert_final_newline = true
max_line_length = off
max_line_length = 120
tab_width = 2
trim_trailing_whitespace = true

[{*.php,*.json}]
indent_size = 4
max_line_length = 110
tab_width = 4

[*.md]
max_line_length = 120
trim_trailing_whitespace = false

.github/CODEOWNERS (vendored, 9 changes)
@@ -1 +1,10 @@
# Global owners
* @ivuorinen

# Workflow files
.github/workflows/ @ivuorinen

# Security files
SECURITY.md @ivuorinen
suppressions.xml @ivuorinen
.gitleaks.toml @ivuorinen

.github/CODE_OF_CONDUCT.md (vendored, 138 changes)
@@ -2,92 +2,144 @@

## 1. Purpose

A primary goal of @ivuorinen's repositories is to be inclusive to the largest number of contributors, with the most varied and diverse backgrounds possible. As such, we are committed to providing a friendly, safe and welcoming environment for all, regardless of gender, sexual orientation, ability, ethnicity, socioeconomic status, and religion (or lack thereof).
A primary goal of @ivuorinen's repositories is to be inclusive to the largest
number of contributors, with the most varied and diverse backgrounds possible.
As such, we are committed to providing a friendly, safe and welcoming
environment for all, regardless of gender, sexual orientation, ability,
ethnicity, socioeconomic status, and religion (or lack thereof).

This code of conduct outlines our expectations for all those who participate in our community, as well as the consequences for unacceptable behavior.
This code of conduct outlines our expectations for all those who participate in
our community, as well as the consequences for unacceptable behavior.

We invite all those who participate in @ivuorinen's repositories to help us create safe and positive experiences for everyone.
We invite all those who participate in @ivuorinen's repositories to help us
create safe and positive experiences for everyone.

## 2. Open [Source/Culture/Tech] Citizenship

A supplemental goal of this Code of Conduct is to increase open [source/culture/tech] citizenship by encouraging participants to recognize and strengthen the relationships between our actions and their effects on our community.
A supplemental goal of this Code of Conduct is to increase
open [source/culture/tech] citizenship by encouraging participants to recognize
and strengthen the relationships between our actions and their effects on our
community.

Communities mirror the societies in which they exist and positive action is essential to counteract the many forms of inequality and abuses of power that exist in society.
Communities mirror the societies in which they exist and positive action is
essential to counteract the many forms of inequality and abuses of power that
exist in society.

If you see someone who is making an extra effort to ensure our community is welcoming, friendly, and encourages all participants to contribute to the fullest extent, we want to know.
If you see someone who is making an extra effort to ensure our community is
welcoming, friendly, and encourages all participants to contribute to the
fullest extent, we want to know.

## 3. Expected Behavior

The following behaviors are expected and requested of all community members:

* Participate in an authentic and active way. In doing so, you contribute to the health and longevity of this community.
* Exercise consideration and respect in your speech and actions.
* Attempt collaboration before conflict.
* Refrain from demeaning, discriminatory, or harassing behavior and speech.
* Be mindful of your surroundings and of your fellow participants. Alert community leaders if you notice a dangerous situation, someone in distress, or violations of this Code of Conduct, even if they seem inconsequential.
* Remember that community event venues may be shared with members of the public; please be respectful to all patrons of these locations.
* Participate in an authentic and active way. In doing so, you contribute to the
  health and longevity of this community.
* Exercise consideration and respect in your speech and actions.
* Attempt collaboration before conflict.
* Refrain from demeaning, discriminatory, or harassing behavior and speech.
* Be mindful of your surroundings and of your fellow participants. Alert
  community leaders if you notice a dangerous situation, someone in distress, or
  violations of this Code of Conduct, even if they seem inconsequential.
* Remember that community event venues may be shared with members of the public;
  please be respectful to all patrons of these locations.

## 4. Unacceptable Behavior

The following behaviors are considered harassment and are unacceptable within our community:
The following behaviors are considered harassment and are unacceptable within
our community:

* Violence, threats of violence or violent language directed against another person.
* Sexist, racist, homophobic, transphobic, ableist or otherwise discriminatory jokes and language.
* Posting or displaying sexually explicit or violent material.
* Posting or threatening to post other people's personally identifying information ("doxing").
* Personal insults, particularly those related to gender, sexual orientation, race, religion, or disability.
* Inappropriate photography or recording.
* Inappropriate physical contact. You should have someone's consent before touching them.
* Unwelcome sexual attention. This includes, sexualized comments or jokes; inappropriate touching, groping, and unwelcomed sexual advances.
* Deliberate intimidation, stalking or following (online or in person).
* Advocating for, or encouraging, any of the above behavior.
* Sustained disruption of community events, including talks and presentations.
* Violence, threats of violence or violent language directed against another
  person.
* Sexist, racist, homophobic, transphobic, ableist or otherwise discriminatory
  jokes and language.
* Posting or displaying sexually explicit or violent material.
* Posting or threatening to post other people's personally identifying
  information ("doxing").
* Personal insults, particularly those related to gender, sexual orientation,
  race, religion, or disability.
* Inappropriate photography or recording.
* Inappropriate physical contact. You should have someone's consent before
  touching them.
* Unwelcome sexual attention. This includes, sexualized comments or jokes;
  inappropriate touching, groping, and unwelcomed sexual advances.
* Deliberate intimidation, stalking or following (online or in person).
* Advocating for, or encouraging, any of the above behavior.
* Sustained disruption of community events, including talks and presentations.

## 5. Weapons Policy

No weapons will be allowed at @ivuorinen's repositories events, community spaces, or in other spaces covered by the scope of this Code of Conduct. Weapons include but are not limited to guns, explosives (including fireworks), and large knives such as those used for hunting or display, as well as any other item used for the purpose of causing injury or harm to others. Anyone seen in possession of one of these items will be asked to leave immediately, and will only be allowed to return without the weapon. Community members are further expected to comply with all state and local laws on this matter.
No weapons will be allowed at @ivuorinen's repositories events, community
spaces, or in other spaces covered by the scope of this Code of Conduct. Weapons
include but are not limited to guns, explosives (including fireworks), and large
knives such as those used for hunting or display, as well as any other item used
for the purpose of causing injury or harm to others. Anyone seen in possession
of one of these items will be asked to leave immediately, and will only be
allowed to return without the weapon. Community members are further expected to
comply with all state and local laws on this matter.

## 6. Consequences of Unacceptable Behavior

Unacceptable behavior from any community member, including sponsors and those with decision-making authority, will not be tolerated.
Unacceptable behavior from any community member, including sponsors and those
with decision-making authority, will not be tolerated.

Anyone asked to stop unacceptable behavior is expected to comply immediately.

If a community member engages in unacceptable behavior, the community organizers may take any action they deem appropriate, up to and including a temporary ban or permanent expulsion from the community without warning (and without refund in the case of a paid event).
If a community member engages in unacceptable behavior, the community organizers
may take any action they deem appropriate, up to and including a temporary ban
or permanent expulsion from the community without warning (and without refund in
the case of a paid event).

## 7. Reporting Guidelines

If you are subject to or witness unacceptable behavior, or have any other concerns, please notify a community organizer as soon as possible. ismo@ivuorinen.net.
If you are subject to or witness unacceptable behavior, or have any other
concerns, please notify a community organizer as soon as possible:
<ismo@ivuorinen.net>

Additionally, community organizers are available to help community members engage with local law enforcement or to otherwise help those experiencing unacceptable behavior feel safe. In the context of in-person events, organizers will also provide escorts as desired by the person experiencing distress.
Additionally, community organizers are available to help community members
engage with local law enforcement or to otherwise help those experiencing
unacceptable behavior feel safe. In the context of in-person events, organizers
will also provide escorts as desired by the person experiencing distress.

## 8. Addressing Grievances

If you feel you have been falsely or unfairly accused of violating this Code of Conduct, you should notify @ivuorinen with a concise description of your grievance. Your grievance will be handled in accordance with our existing governing policies.
If you feel you have been falsely or unfairly accused of violating this Code of
Conduct, you should notify @ivuorinen with a concise description of your
grievance. Your grievance will be handled in accordance with our existing
governing policies.

## 9. Scope

We expect all community participants (contributors, paid or otherwise; sponsors; and other guests) to abide by this Code of Conduct in all community venues--online and in-person--as well as in all one-on-one communications pertaining to community business.
We expect all community participants (contributors, paid or otherwise; sponsors;
and other guests) to abide by this Code of Conduct in all community
venues--online and in-person--as well as in all one-on-one communications
pertaining to community business.

This code of conduct and its related procedures also applies to unacceptable behavior occurring outside the scope of community activities when such behavior has the potential to adversely affect the safety and well-being of community members.
This code of conduct and its related procedures also applies to unacceptable
behavior occurring outside the scope of community activities when such behavior
has the potential to adversely affect the safety and well-being of community
members.

## 10. Contact info

@ivuorinen
ismo@ivuorinen.net
<ismo@ivuorinen.net>

## 11. License and attribution

The Citizen Code of Conduct is distributed by [Stumptown Syndicate](http://stumptownsyndicate.org) under a [Creative Commons Attribution-ShareAlike license](http://creativecommons.org/licenses/by-sa/3.0/).
The Citizen Code of Conduct is distributed by [Stumptown Syndicate][stumptown]
under a [Creative Commons Attribution-ShareAlike license][cc-by-sa].

Portions of text derived from the [Django Code of Conduct](https://www.djangoproject.com/conduct/) and the [Geek Feminism Anti-Harassment Policy](http://geekfeminism.wikia.com/wiki/Conference_anti-harassment/Policy).
Portions of text derived from the [Django Code of Conduct][django] and
the [Geek Feminism Anti-Harassment Policy][geek-feminism].

_Revision 2.3. Posted 6 March 2017._
* _Revision 2.3. Posted 6 March 2017._
* _Revision 2.2. Posted 4 February 2016._
* _Revision 2.1. Posted 23 June 2014._
* _Revision 2.0, adopted by the [Stumptown Syndicate][stumptown] board on 10
  January 2013. Posted 17 March 2013._

_Revision 2.2. Posted 4 February 2016._

_Revision 2.1. Posted 23 June 2014._

_Revision 2.0, adopted by the [Stumptown Syndicate](http://stumptownsyndicate.org) board on 10 January 2013. Posted 17 March 2013._
[stumptown]: https://github.com/stumpsyn
[cc-by-sa]: https://creativecommons.org/licenses/by-sa/3.0/
[django]: https://www.djangoproject.com/conduct/
[geek-feminism]: http://geekfeminism.wikia.com/wiki/Conference_anti-harassment/Policy

.github/ISSUE_TEMPLATE/bug_report.md (vendored, 17 changes)
@@ -12,6 +12,7 @@ A clear and concise description of what the bug is.

**To Reproduce**
Steps to reproduce the behavior:

1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
@@ -24,15 +25,17 @@ A clear and concise description of what you expected to happen.
If applicable, add screenshots to help explain your problem.

**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]

- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]

**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]

- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]

**Additional context**
Add any other context about the problem here.

.github/ISSUE_TEMPLATE/feature_request.md (vendored, 6 changes)
@@ -8,13 +8,15 @@ assignees: ivuorinen
---

**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
A clear and concise description of what the problem is. Ex. I'm always
frustrated when [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
A clear and concise description of any alternative solutions or features you've
considered.

**Additional context**
Add any other context or screenshots about the feature request here.

.github/SECURITY.md (vendored, new file, 166 lines)
@@ -0,0 +1,166 @@
# Security Policy

## Supported Versions

| Version | Supported          |
| ------- | ------------------ |
| main    | :white_check_mark: |

## Reporting a Vulnerability

1. **Do Not** open a public issue
2. Email security concerns to <ismo@ivuorinen.net>
3. Include:
   - Description of the vulnerability
   - Steps to reproduce
   - Potential impact
   - Suggested fix (if any)

We will respond within 48 hours and work on a fix if validated.

## Security Measures

This repository implements:

- CodeQL scanning
- OWASP Dependency Check
- Snyk vulnerability scanning
- Gitleaks secret scanning
- Trivy vulnerability scanner
- MegaLinter code analysis
- Regular security updates
- Automated fix PRs
- Daily security scans
- Weekly metrics collection

## Vulnerability Suppressions

This repository uses OWASP Dependency Check for security scanning. Some vulnerabilities may be suppressed if:

1. They are false positives
2. They affect only test/development dependencies
3. They have been assessed and determined to not be exploitable in our context

### Suppression File

Suppressions are managed in `suppressions.xml` in the root directory. Each suppression must include:

- Detailed notes explaining why the vulnerability is suppressed
- Specific identifiers (CVE, package, etc.)
- Regular review date

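The exact Dependency Check invocation is not part of this commit; as a rough sketch only, a scan step would typically point the scanner at the suppression file through the CLI's `--suppression` flag (the project name, output path, and the assumption that the CLI is already installed are illustrative, not taken from this repository):

```yaml
# Hypothetical step: assumes the Dependency Check CLI is available on the runner.
- name: OWASP Dependency Check
  shell: bash
  run: |
    dependency-check.sh \
      --project "actions" \
      --scan . \
      --format SARIF \
      --out reports \
      --suppression suppressions.xml  # reviewed entries from the repository root
```
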
### Adding New Suppressions

To add a new suppression:

1. Add the entry to `suppressions.xml`
2. Include detailed notes explaining the reason
3. Create a PR with the changes
4. Get security team review

### Reviewing Suppressions

Suppressions are reviewed:

- Monthly during security scans
- When related dependencies are updated
- During security audits

## Security Best Practices

When using these actions:

1. Pin to commit hashes instead of tags
2. Use least-privilege token permissions
3. Validate all inputs
4. Set appropriate timeouts
5. Configure required security scanners:
   - Add `suppressions.xml` for OWASP Dependency Check
   - Add `.gitleaks.toml` for Gitleaks configuration

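A minimal sketch of what practices 1, 2 and 4 look like in a consuming workflow (the SHA below is a placeholder, not a real pinned revision):

```yaml
permissions:
  contents: read  # least-privilege default for the whole workflow

jobs:
  lint:
    runs-on: ubuntu-latest
    timeout-minutes: 10  # explicit timeout so a hung step cannot run forever
    steps:
      # Pin third-party actions to a full commit SHA instead of a mutable tag.
      - uses: actions/checkout@0000000000000000000000000000000000000000 # v4, placeholder SHA
```
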
## Required Secrets

The following secrets should be configured in your repository:

| Secret Name | Description | Required |
|-------------|-------------|----------|
| `SNYK_TOKEN` | Token for Snyk vulnerability scanning | Optional |
| `GITLEAKS_LICENSE` | License for Gitleaks scanning | Optional |
| `SLACK_WEBHOOK` | Webhook URL for Slack notifications | Optional |
| `SONAR_TOKEN` | Token for SonarCloud analysis | Optional |
| `FIXIMUS_TOKEN` | Token for automated fixes | Optional |

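All of these secrets are optional: the workflows added in this commit probe for them at runtime and skip the dependent steps when a secret is missing (see the Gitleaks gate in `action-security.yml`). A minimal sketch of that pattern, shown here with a hypothetical Snyk step as the example:

```yaml
- name: Check optional secret
  id: check
  shell: bash
  run: |
    if [ -n "${{ secrets.SNYK_TOKEN }}" ]; then
      echo "run_snyk=true" >> "$GITHUB_OUTPUT"
    else
      echo "::warning::SNYK_TOKEN not configured - skipping Snyk scan"
      echo "run_snyk=false" >> "$GITHUB_OUTPUT"
    fi

- name: Snyk scan
  if: steps.check.outputs.run_snyk == 'true'
  run: echo "Snyk scan would run here"  # placeholder for the actual scanner step
```
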
## Security Workflows

This repository includes several security-focused workflows:

1. **Daily Security Checks** (`security.yml`)
   - Runs comprehensive security scans
   - Creates automated fix PRs
   - Generates security reports

2. **Action Security** (`action-security.yml`)
   - Validates GitHub Action files
   - Checks for hardcoded credentials
   - Scans for vulnerabilities

3. **CodeQL Analysis** (`codeql.yml`)
   - Analyzes code for security issues
   - Runs on multiple languages
   - Weekly scheduled scans

4. **Security Metrics** (`security-metrics.yml`)
   - Collects security metrics
   - Generates trend reports
   - Weekly analysis

## Security Reports

Security scan results are available as:

1. SARIF reports in GitHub Security tab
2. Artifacts in workflow runs
3. Automated issues for critical findings
4. Weekly trend reports
5. Security metrics dashboard

## Automated Fixes

The repository automatically:

1. Creates PRs for fixable vulnerabilities
2. Updates dependencies with security issues
3. Fixes code security issues
4. Creates detailed fix documentation

## Regular Reviews

We conduct:

1. Daily automated security scans
2. Weekly metrics analysis
3. Monthly suppression reviews
4. Regular dependency updates

## Contributing

When contributing to this repository:

1. Follow security best practices
2. Do not commit sensitive information
3. Use provided security tools
4. Review security documentation

## Support

For security-related questions:

1. Review existing security documentation
2. Check closed security issues
3. Contact security team at <ismo@ivuorinen.net>

Do not open public issues for security concerns.

## License

The security policy and associated tools are covered under the repository's MIT license.

.github/renovate.json (vendored, 20 changes)
@@ -1,6 +1,20 @@
{
  "$schema": "https://docs.renovatebot.com/renovate-schema.json",
  "extends": [
    "github>ivuorinen/renovate-config"
  ]
  "extends": ["github>ivuorinen/renovate-config"],
  "packageRules": [
    {
      "matchUpdateTypes": ["minor", "patch"],
      "matchCurrentVersion": "!/^0/",
      "automerge": true
    },
    {
      "matchDepTypes": ["devDependencies"],
      "automerge": true
    }
  ],
  "schedule": ["every weekend"],
  "vulnerabilityAlerts": {
    "labels": ["security"],
    "assignees": ["ivuorinen"]
  }
}

.github/workflows/action-security.yml (vendored, new file, 260 lines)
@@ -0,0 +1,260 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Action Security

on:
  push:
    paths:
      - '**/action.yml'
      - '**/action.yaml'
  pull_request:
    paths:
      - '**/action.yml'
      - '**/action.yaml'

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  analyze:
    name: Analyze Action Security
    runs-on: ubuntu-latest
    timeout-minutes: 30

    permissions:
      contents: read
      security-events: write
      actions: read
      pull-requests: read
      statuses: write
      issues: write

    steps:
      - name: Checkout Repository
        uses: actions/checkout@v4
        with:
          fetch-depth: 0

      - name: Check Required Configurations
        id: check-configs
        shell: bash
        run: |
          # Initialize all flags as false
          {
            echo "run_gitleaks=false"
            echo "run_trivy=true"
          } >> "$GITHUB_OUTPUT"

          # Check Gitleaks configuration and license
          if [ -f ".gitleaks.toml" ] && [ -n "${{ secrets.GITLEAKS_LICENSE }}" ]; then
            echo "Gitleaks config and license found"
            echo "run_gitleaks=true" >> "$GITHUB_OUTPUT"
          else
            echo "::warning::Gitleaks config or license missing - skipping Gitleaks scan"
          fi

      - name: Run actionlint
        uses: raven-actions/actionlint@v2
        with:
          cache: true
          fail-on-error: true
          shellcheck: false

      - name: Run Gitleaks
        if: steps.check-configs.outputs.run_gitleaks == 'true'
        uses: gitleaks/gitleaks-action@v2.3.2
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
          GITLEAKS_LICENSE: ${{ secrets.GITLEAKS_LICENSE }}
        with:
          config-path: .gitleaks.toml
          report-format: sarif
          report-path: gitleaks-report.sarif

      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          security-checks: 'vuln,config,secret'
          format: 'sarif'
          output: 'trivy-results.sarif'
          severity: 'CRITICAL,HIGH'
          timeout: '10m'

      - name: Verify SARIF files
        id: verify-sarif
        shell: bash
        run: |
          # Initialize outputs
          {
            echo "has_trivy=false"
            echo "has_gitleaks=false"
          } >> "$GITHUB_OUTPUT"

          # Check Trivy results
          if [ -f "trivy-results.sarif" ]; then
            if jq -e . </dev/null 2>&1 <"trivy-results.sarif"; then
              echo "has_trivy=true" >> "$GITHUB_OUTPUT"
            else
              echo "::warning::Trivy SARIF file exists but is not valid JSON"
            fi
          fi

          # Check Gitleaks results if it ran
          if [ "${{ steps.check-configs.outputs.run_gitleaks }}" = "true" ]; then
            if [ -f "gitleaks-report.sarif" ]; then
              if jq -e . </dev/null 2>&1 <"gitleaks-report.sarif"; then
                echo "has_gitleaks=true" >> "$GITHUB_OUTPUT"
              else
                echo "::warning::Gitleaks SARIF file exists but is not valid JSON"
              fi
            fi
          fi

      - name: Upload Trivy results
        if: steps.verify-sarif.outputs.has_trivy == 'true'
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: 'trivy-results.sarif'
          category: 'trivy'

      - name: Upload Gitleaks results
        if: steps.verify-sarif.outputs.has_gitleaks == 'true'
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: 'gitleaks-report.sarif'
          category: 'gitleaks'

      - name: Archive security reports
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: security-reports-${{ github.run_id }}
          path: |
            ${{ steps.verify-sarif.outputs.has_trivy == 'true' && 'trivy-results.sarif' || '' }}
            ${{ steps.verify-sarif.outputs.has_gitleaks == 'true' && 'gitleaks-report.sarif' || '' }}
          retention-days: 30

      - name: Analyze Results
        if: always()
        uses: actions/github-script@v7
        with:
          script: |
            const fs = require('fs');

            try {
              let totalIssues = 0;
              let criticalIssues = 0;

              const analyzeSarif = (file, tool) => {
                if (!fs.existsSync(file)) {
                  console.log(`No results file found for ${tool}`);
                  return null;
                }

                try {
                  const sarif = JSON.parse(fs.readFileSync(file, 'utf8'));
                  return sarif.runs.reduce((acc, run) => {
                    if (!run.results) return acc;

                    const critical = run.results.filter(r =>
                      r.level === 'error' ||
                      r.level === 'critical' ||
                      (r.ruleId || '').toLowerCase().includes('critical')
                    ).length;

                    return {
                      total: acc.total + run.results.length,
                      critical: acc.critical + critical
                    };
                  }, { total: 0, critical: 0 });
                } catch (error) {
                  console.log(`Error analyzing ${tool} results: ${error.message}`);
                  return null;
                }
              };

              // Only analyze results from tools that ran successfully
              const results = {
                trivy: ${{ steps.verify-sarif.outputs.has_trivy }} ?
                  analyzeSarif('trivy-results.sarif', 'trivy') : null,
                gitleaks: ${{ steps.verify-sarif.outputs.has_gitleaks }} ?
                  analyzeSarif('gitleaks-report.sarif', 'gitleaks') : null
              };

              // Aggregate results
              Object.entries(results).forEach(([tool, result]) => {
                if (result) {
                  totalIssues += result.total;
                  criticalIssues += result.critical;
                  console.log(`${tool}: ${result.total} total, ${result.critical} critical issues`);
                }
              });

              // Create summary
              const summary = `## Security Scan Summary

              - Total Issues Found: ${totalIssues}
              - Critical Issues: ${criticalIssues}

              ### Tool Breakdown
              ${Object.entries(results)
                .filter(([_, r]) => r)
                .map(([tool, r]) =>
                  `- ${tool}: ${r.total} total, ${r.critical} critical`
                ).join('\n')}

              ### Tools Run Status
              - Trivy: ${{ steps.verify-sarif.outputs.has_trivy }}
              - Gitleaks: ${{ steps.check-configs.outputs.run_gitleaks }}
              `;

              // Set output
              core.setOutput('total_issues', totalIssues);
              core.setOutput('critical_issues', criticalIssues);

              // Add job summary
              await core.summary
                .addRaw(summary)
                .write();

              // Fail if critical issues found
              if (criticalIssues > 0) {
                core.setFailed(`Found ${criticalIssues} critical security issues`);
              }
            } catch (error) {
              core.setFailed(`Analysis failed: ${error.message}`);
            }

      - name: Notify on Critical Issues
        if: failure()
        uses: actions/github-script@v7
        with:
          script: |
            const { repo, owner } = context.repo;
            const critical = core.getInput('critical_issues');

            const body = `🚨 Critical security issues found in GitHub Actions

            ${critical} critical security issues were found during the security scan.

            ### Scan Results
            - Trivy: ${{ steps.verify-sarif.outputs.has_trivy == 'true' && 'Completed' || 'Skipped/Failed' }}
            - Gitleaks: ${{ steps.check-configs.outputs.run_gitleaks == 'true' && 'Completed' || 'Skipped' }}

            [View detailed scan results](https://github.com/${owner}/${repo}/actions/runs/${context.runId})

            Please address these issues immediately.

            > Note: Some security tools might have been skipped due to missing configurations.
            > Check the workflow run for details.`;

            await github.rest.issues.create({
              owner,
              repo,
              title: '🚨 Critical Security Issues in Actions',
              body,
              labels: ['security', 'critical', 'actions'],
              assignees: ['ivuorinen']
            });

.github/workflows/auto-approve.yml (vendored, new file, 231 lines)
@@ -0,0 +1,231 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Auto Approve

on:
  pull_request_target:
    types:
      - opened
      - reopened
      - synchronize
      - ready_for_review

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  auto-approve:
    name: 👍 Auto Approve
    runs-on: ubuntu-latest
    timeout-minutes: 5

    permissions:
      pull-requests: write
      contents: read

    steps:
      - name: Check Required Secrets
        id: check-secrets
        run: |
          if [ -z "${{ secrets.APP_ID }}" ] || [ -z "${{ secrets.APP_PRIVATE_KEY }}" ]; then
            echo "::warning::GitHub App credentials not configured. Using GITHUB_TOKEN with limited functionality."
            echo "use_github_token=true" >> "$GITHUB_OUTPUT"
          else
            echo "use_github_token=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Generate Token
        id: generate-token
        if: steps.check-secrets.outputs.use_github_token == 'false'
        uses: actions/create-github-app-token@v1
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.APP_PRIVATE_KEY }}

      - name: Add Initial Status Comment
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;

            // shellcheck disable=SC2016
            const token_type = '${{ steps.check-secrets.outputs.use_github_token }}' === 'true'
              ? 'GITHUB_TOKEN (limited functionality)'
              : 'GitHub App token';

            await github.rest.issues.createComment({
              owner,
              repo,
              issue_number: pr.number,
              body: `⏳ Checking PR eligibility for auto-approval using ${token_type}...`
            });

      - name: Check PR Eligibility
        id: check
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;

            // Configuration for trusted conditions
            const trustedAuthors = ['dependabot[bot]', 'renovate[bot]', 'fiximus'];
            const trustedLabels = ['dependencies', 'automated-pr'];
            const excludedLabels = ['do-not-merge', 'work-in-progress'];
            const trustedPaths = ['package.json', 'package-lock.json', 'yarn.lock', 'pnpm-lock.yaml'];

            // Results object to store all check results
            const results = {
              isTrustedAuthor: false,
              hasRequiredLabel: false,
              hasExcludedLabel: false,
              onlyTrustedFiles: false,
              limitedPermissions: '${{ steps.check-secrets.outputs.use_github_token }}' === 'true'
            };

            // Check author
            results.isTrustedAuthor = trustedAuthors.includes(pr.user.login);

            // Check labels
            results.hasRequiredLabel = pr.labels.some(label =>
              trustedLabels.includes(label.name)
            );

            results.hasExcludedLabel = pr.labels.some(label =>
              excludedLabels.includes(label.name)
            );

            try {
              // Get changed files
              const files = await github.rest.pulls.listFiles({
                owner,
                repo,
                pull_number: pr.number
              });

              // Check if only trusted paths are modified
              results.onlyTrustedFiles = files.data.every(file =>
                trustedPaths.some(path => file.filename.includes(path))
              );
            } catch (error) {
              console.error('Error checking files:', error);
              results.onlyTrustedFiles = false;
            }

            // Store detailed results for the next step
            core.setOutput('results', JSON.stringify(results));

            // Set final approval decision
            const shouldApprove = results.isTrustedAuthor &&
              results.hasRequiredLabel &&
              !results.hasExcludedLabel &&
              results.onlyTrustedFiles;

            core.setOutput('should_approve', shouldApprove);

            // Log results
            console.log('Eligibility check results:', results);

      - name: Process Auto Approval
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;

            // Parse check results
            const results = JSON.parse('${{ steps.check.outputs.results }}');
            const shouldApprove = '${{ steps.check.outputs.should_approve }}' === 'true';

            // Create status report
            let statusReport = `## 🔍 Auto Approval Check Results\n\n`;

            if (results.limitedPermissions) {
              statusReport += `⚠️ **Note:** Running with limited permissions (GITHUB_TOKEN)\n\n`;
            }

            statusReport += `### Checks\n`;
            statusReport += `- Trusted Author: ${results.isTrustedAuthor ? '✅' : '❌'}\n`;
            statusReport += `- Required Labels: ${results.hasRequiredLabel ? '✅' : '❌'}\n`;
            statusReport += `- Excluded Labels: ${!results.hasExcludedLabel ? '✅' : '❌'}\n`;
            statusReport += `- Trusted Files Only: ${results.onlyTrustedFiles ? '✅' : '❌'}\n\n`;

            if (shouldApprove) {
              statusReport += `### ✅ Result: Auto-approved\n`;

              try {
                // Create approval review
                await github.rest.pulls.createReview({
                  owner,
                  repo,
                  pull_number: pr.number,
                  event: 'APPROVE',
                  body: 'Automatically approved based on trusted conditions.'
                });

                // Add auto-merge label
                await github.rest.issues.addLabels({
                  owner,
                  repo,
                  issue_number: pr.number,
                  labels: ['auto-merge']
                });
              } catch (error) {
                console.error('Error during approval:', error);
                statusReport += `\n⚠️ Error during approval process: ${error.message}\n`;
              }
            } else {
              statusReport += `### ❌ Result: Not eligible for auto-approval\n`;

              if (results.limitedPermissions) {
                statusReport += `\n⚠️ Note: Some functionality may be limited due to running with GITHUB_TOKEN.\n`;
                statusReport += `Configure APP_ID and APP_PRIVATE_KEY for full functionality.\n`;
              }
            }

            // Add final status comment
            await github.rest.issues.createComment({
              owner,
              repo,
              issue_number: pr.number,
              body: statusReport
            });

      - name: Handle Errors
        if: failure()
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;
            const isLimitedPermissions = '${{ steps.check-secrets.outputs.use_github_token }}' === 'true';

            const errorMessage = `## ❌ Auto Approval Error

            The auto-approval process encountered an error.

            ### Troubleshooting
            - Check the [workflow logs](${process.env.GITHUB_SERVER_URL}/${owner}/${repo}/actions/runs/${process.env.GITHUB_RUN_ID})
            - Verify repository permissions
            - Ensure all required configurations are present

            ${isLimitedPermissions ? '⚠️ Note: Running with limited permissions (GITHUB_TOKEN)' : ''}
            `;

            try {
              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: pr.number,
                body: errorMessage
              });
            } catch (error) {
              console.error('Failed to create error comment:', error);
              core.setFailed(`Failed to create error comment: ${error.message}`);
            }

.github/workflows/auto-merge.yml (vendored, new file, 175 lines)
@@ -0,0 +1,175 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Auto Merge

on:
  pull_request_target:
    types:
      - opened
      - synchronize
      - reopened
      - labeled
      - unlabeled
  check_suite:
    types:
      - completed
  status: {}

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: false # Don't cancel as this could leave PRs in inconsistent state

jobs:
  auto-merge:
    name: 🤝 Auto Merge
    runs-on: ubuntu-latest
    timeout-minutes: 5

    permissions:
      contents: write
      pull-requests: write
      checks: read
      statuses: read

    steps:
      - name: Check Required Secrets
        id: check-secrets
        run: |
          # shellcheck disable=SC2016
          if [ -z "${{ secrets.APP_ID }}" ] || [ -z "${{ secrets.APP_PRIVATE_KEY }}" ]; then
            echo "::warning::GitHub App credentials not configured. Using GITHUB_TOKEN instead."
            echo "use_github_token=true" >> $GITHUB_OUTPUT
          else
            echo "use_github_token=false" >> $GITHUB_OUTPUT
          fi

      - name: Generate Token
        id: generate-token
        if: steps.check-secrets.outputs.use_github_token == 'false'
        uses: actions/create-github-app-token@v1
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.APP_PRIVATE_KEY }}

      - name: Auto Merge PR
        uses: pascalgn/automerge-action@v0.15.6
        env:
          GITHUB_TOKEN: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          MERGE_LABELS: 'dependencies,automated-pr,!work-in-progress,!do-not-merge'
          MERGE_METHOD: 'squash'
          MERGE_COMMIT_MESSAGE: 'pull-request-title'
          MERGE_RETRIES: '6'
          MERGE_RETRY_SLEEP: '10000'
          MERGE_REQUIRED_APPROVALS: '0'
          MERGE_DELETE_BRANCH: 'true'
          UPDATE_LABELS: 'automerge'
          UPDATE_METHOD: 'rebase'
          MERGE_ERROR_FAIL: 'false'

      - name: Check Merge Status
        if: always()
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;

            if (!pr) return;

            try {
              const status = await github.rest.pulls.get({
                owner,
                repo,
                pull_number: pr.number
              });

              if (status.data.merged) {
                console.log(`PR #${pr.number} was successfully merged`);

                // Add merge success comment
                await github.rest.issues.createComment({
                  owner,
                  repo,
                  issue_number: pr.number,
                  body: '✅ Successfully auto-merged! Branch will be deleted.'
                });
              } else {
                console.log(`PR #${pr.number} is not merged. State: ${status.data.state}`);

                // Check merge blockers
                if (status.data.mergeable_state === 'blocked') {
                  console.log('PR is blocked from merging. Check branch protection rules.');
                  await github.rest.issues.createComment({
                    owner,
                    repo,
                    issue_number: pr.number,
                    body: '⚠️ Auto-merge is blocked. Please check branch protection rules and resolve any conflicts.'
                  });
                }

                // Check if using reduced permissions
                if ('${{ steps.check-secrets.outputs.use_github_token }}' === 'true') {
                  await github.rest.issues.createComment({
                    owner,
                    repo,
                    issue_number: pr.number,
                    body: '⚠️ Note: Running with reduced permissions as GitHub App credentials are not configured.'
                  });
                }
              }
            } catch (error) {
              console.error('Error checking merge status:', error);
              core.setFailed(`Failed to check merge status: ${error.message}`);

              // Add error comment to PR
              try {
                await github.rest.issues.createComment({
                  owner,
                  repo,
                  issue_number: pr.number,
                  body: `❌ Error checking merge status: ${error.message}`
                });
              } catch (commentError) {
                console.error('Failed to add error comment:', commentError);
              }
            }

      - name: Remove Labels on Failure
        if: failure()
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;
            const pr = context.payload.pull_request;

            if (!pr) return;

            try {
              // Remove automerge label
              await github.rest.issues.removeLabel({
                owner,
                repo,
                issue_number: pr.number,
                name: 'automerge'
              }).catch(e => console.log('automerge label not found'));

              // Add merge-failed label
              await github.rest.issues.addLabels({
                owner,
                repo,
                issue_number: pr.number,
                labels: ['merge-failed']
              });

              // Add failure comment
              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: pr.number,
                body: '❌ Auto-merge failed. The automerge label has been removed and merge-failed label added.'
              });
            } catch (error) {
              console.error('Error handling merge failure:', error);
            }

.github/workflows/auto-rebase.yml (vendored, new file, 183 lines)
@@ -0,0 +1,183 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Auto Rebase

on:
  push:
    branches:
      - main
      - master
  pull_request_target:
    types:
      - labeled
  issue_comment:
    types:
      - created

concurrency:
  group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
  cancel-in-progress: true

jobs:
  auto-rebase:
    name: 🔄 Auto Rebase
    if: |
      (github.event_name == 'issue_comment' &&
      github.event.issue.pull_request &&
      contains(github.event.comment.body, '/rebase')) ||
      (github.event_name == 'pull_request_target' &&
      contains(github.event.pull_request.labels.*.name, 'auto-rebase')) ||
      github.event_name == 'push'
    runs-on: ubuntu-latest
    timeout-minutes: 5

    permissions:
      contents: write
      pull-requests: write
      issues: write

    steps:
      - name: Check Required Secrets
        id: check-secrets
        run: |
          if [ -z "${{ secrets.APP_ID }}" ] || [ -z "${{ secrets.APP_PRIVATE_KEY }}" ]; then
            echo "::warning::GitHub App credentials not configured. Using GITHUB_TOKEN instead."
            echo "use_github_token=true" >> "$GITHUB_OUTPUT"

            if [ "${{ github.event_name }}" == "push" ]; then
              echo "::warning::Running with GITHUB_TOKEN on push events may have limited functionality."
            fi
          else
            echo "use_github_token=false" >> "$GITHUB_OUTPUT"
          fi

      - name: Generate Token
        id: generate-token
        if: steps.check-secrets.outputs.use_github_token == 'false'
        uses: actions/create-github-app-token@v1
        with:
          app-id: ${{ secrets.APP_ID }}
          private-key: ${{ secrets.APP_PRIVATE_KEY }}

      - name: Add Initial Comment
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;

            // Get PR number based on event type
            let prNumber;
            if (context.eventName === 'issue_comment') {
              prNumber = context.payload.issue.number;
            } else if (context.eventName === 'pull_request_target') {
              prNumber = context.payload.pull_request.number;
            }

            if (prNumber) {
              const token_type = '${{ steps.check-secrets.outputs.use_github_token }}' === 'true'
                ? 'GITHUB_TOKEN (limited permissions)'
                : 'GitHub App token';

              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: prNumber,
                body: `🔄 Attempting rebase using ${token_type}...`
              });
            }

      - name: Checkout Repository
        uses: actions/checkout@v4
        with:
          token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          fetch-depth: 0

      - name: Auto Rebase
        id: rebase
        continue-on-error: true
        uses: cirrus-actions/rebase@1.8
        env:
          GITHUB_TOKEN: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}

      - name: Handle Rebase Result
        uses: actions/github-script@v7
        with:
          github-token: ${{ steps.check-secrets.outputs.use_github_token == 'true' && github.token || steps.generate-token.outputs.token }}
          script: |
            const { repo, owner } = context.repo;

            // Get PR number based on event type
            let prNumber;
            if (context.eventName === 'issue_comment') {
              prNumber = context.payload.issue.number;
            } else if (context.eventName === 'pull_request_target') {
              prNumber = context.payload.pull_request.number;
            }

            if (prNumber) {
              const rebaseSuccess = '${{ steps.rebase.outcome }}' === 'success';
              const usingGithubToken = '${{ steps.check-secrets.outputs.use_github_token }}' === 'true';

              let commentBody;
              if (rebaseSuccess) {
                commentBody = '✅ Rebase completed successfully!';
              } else {
                commentBody = '❌ Rebase failed.\n\n';

                if (usingGithubToken) {
                  commentBody += '⚠️ Note: This workflow is running with reduced permissions (GITHUB_TOKEN).\n' +
                    'For better functionality, configure APP_ID and APP_PRIVATE_KEY secrets.\n\n';
                }

                commentBody += 'Please try to:\n' +
                  '1. Resolve any conflicts manually\n' +
                  '2. Ensure branch is not protected\n' +
                  '3. Verify you have proper permissions';
              }

              // Add result comment
              await github.rest.issues.createComment({
                owner,
                repo,
                issue_number: prNumber,
                body: commentBody
              });

              // Handle labels
              try {
                // Remove auto-rebase label if it exists
                await github.rest.issues.removeLabel({
                  owner,
                  repo,
                  issue_number: prNumber,
                  name: 'auto-rebase'
                }).catch(e => console.log('auto-rebase label not found'));

                // Add appropriate result label
                const resultLabel = rebaseSuccess ? 'rebase-succeeded' : 'rebase-failed';
                await github.rest.issues.addLabels({
                  owner,
                  repo,
                  issue_number: prNumber,
                  labels: [resultLabel]
                });
              } catch (error) {
                console.log('Error handling labels:', error);
              }
            }

            // Set action status based on rebase result
            if ('${{ steps.rebase.outcome }}' !== 'success') {
              core.setFailed('Rebase failed');
            }

      - name: Cleanup
        if: always()
        run: |
          # Reset any pending changes
          git reset --hard
          git checkout main

          # Clean up any temporary branches
          git fetch --prune

.github/workflows/codeql.yml (vendored, new file, 43 lines)
@@ -0,0 +1,43 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: 'CodeQL'

on:
  push:
    branches: ['main']
  pull_request:
    branches: ['main']
  schedule:
    - cron: '30 1 * * 0' # Run at 1:30 AM UTC every Sunday

jobs:
  analyze:
    name: Analyze
    runs-on: ubuntu-latest
    permissions:
      actions: read
      contents: read
      security-events: write

    strategy:
      fail-fast: false
      matrix:
        language: ['javascript'] # Add languages used in your actions

    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Initialize CodeQL
        uses: github/codeql-action/init@v3
        with:
          languages: ${{ matrix.language }}
          queries: security-and-quality

      - name: Autobuild
        uses: github/codeql-action/autobuild@v3

      - name: Perform CodeQL Analysis
        uses: github/codeql-action/analyze@v3
        with:
          category: '/language:${{matrix.language}}'

.github/workflows/dependency-review.yml (vendored, new file, 14 lines)
@@ -0,0 +1,14 @@
name: 'Dependency Review'
on: [pull_request]

permissions:
  contents: read

jobs:
  dependency-review:
    runs-on: ubuntu-latest
    steps:
      - name: 'Checkout Repository'
        uses: actions/checkout@v4
      - name: 'Dependency Review'
        uses: actions/dependency-review-action@v4.5.0

.github/workflows/pr-lint.yml (vendored, 211 changes)
@@ -1,17 +1,208 @@
---
name: PR Lint
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: MegaLinter

on:
  push:
    branches-ignore: [master, main]
  pull_request:
    branches: [master, main]
    branches:
      - main
      - master
    paths-ignore:
      - '**.md'
      - 'docs/**'
      - '.github/*.md'
      - 'LICENSE'

permissions:
  contents: read
  packages: read
  statuses: write
  pull_request:
    branches:
      - main
      - master
    paths-ignore:
      - '**.md'
      - 'docs/**'
      - '.github/*.md'
      - 'LICENSE'

env:
  # Apply linter fixes configuration
  APPLY_FIXES: all
  APPLY_FIXES_EVENT: pull_request
  APPLY_FIXES_MODE: commit

  # Disable linters that do not work or conflict
  DISABLE_LINTERS: REPOSITORY_DEVSKIM

  # Additional settings
  VALIDATE_ALL_CODEBASE: ${{ github.event_name == 'push' && github.ref == 'refs/heads/main' }}
  GITHUB_TOKEN: ${{ secrets.FIXIMUS_TOKEN || secrets.GITHUB_TOKEN }}

  # Report configuration
  REPORT_OUTPUT_FOLDER: megalinter-reports
  ENABLE_SUMMARY_REPORTER: true
  ENABLE_SARIF_REPORTER: true

concurrency:
  group: ${{ github.workflow }}-${{ github.ref }}
  cancel-in-progress: true

jobs:
  SuperLinter:
    uses: ivuorinen/.github/.github/workflows/pr-lint.yml@main
  megalinter:
    name: MegaLinter
    runs-on: ubuntu-latest
    timeout-minutes: 30

    permissions:
      contents: write
      issues: write
      pull-requests: write
      security-events: write

    steps:
      - name: Checkout Code
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.PAT || secrets.GITHUB_TOKEN }}
          fetch-depth: 0

      - name: MegaLinter
        id: ml
        uses: oxsecurity/megalinter/flavors/cupcake@v8.4.0
        env:
          PARALLEL: true # Run linters in parallel
          FILTER_REGEX_EXCLUDE: '(\.automation/test|docs/json-schemas|\.github/workflows)'

          # Error configuration
          ERROR_ON_MISSING_EXEC_BIT: true
          CLEAR_REPORT_FOLDER: true
          PRINT_ALPACA: false
          SHOW_ELAPSED_TIME: true

          # File configuration
          YAML_YAMLLINT_CONFIG_FILE: .yamllint.yml
          YAML_PRETTIER_CONFIG_FILE: .prettierrc.yml
          YAML_YAMLLINT_FILTER_REGEX_EXCLUDE: '(\.automation/test|docs/json-schemas|\.github/workflows)'

      - name: Check MegaLinter Results
        id: check-results
        if: always()
        shell: bash
        run: |
          echo "status=success" >> "$GITHUB_OUTPUT"

          if [ -f "${{ env.REPORT_OUTPUT_FOLDER }}/megalinter.log" ]; then
            if grep -q "ERROR\|CRITICAL" "${{ env.REPORT_OUTPUT_FOLDER }}/megalinter.log"; then
              echo "Linting errors found"
              echo "status=failure" >> "$GITHUB_OUTPUT"
            fi
          else
            echo "::warning::MegaLinter log file not found"
          fi

      - name: Upload Reports
        if: always()
        uses: actions/upload-artifact@v4
        with:
          name: MegaLinter reports
          path: |
            megalinter-reports
            mega-linter.log
          retention-days: 30

      - name: Upload SARIF Report
        if: always() && hashFiles('megalinter-reports/sarif/*.sarif')
        uses: github/codeql-action/upload-sarif@v2
        with:
          sarif_file: megalinter-reports/sarif
          category: megalinter

      - name: Prepare Git for Fixes
        if: steps.ml.outputs.has_updated_sources == 1
        shell: bash
        run: |
          sudo chown -Rc $UID .git/
          git config --global user.name "fiximus"
          git config --global user.email "github-bot@ivuorinen.net"

      - name: Create Pull Request
        if: |
          steps.ml.outputs.has_updated_sources == 1 &&
          (env.APPLY_FIXES_EVENT == 'all' || env.APPLY_FIXES_EVENT == github.event_name) &&
          env.APPLY_FIXES_MODE == 'pull_request' &&
          (github.event_name == 'push' || github.event.pull_request.head.repo.full_name == github.repository) &&
          !contains(github.event.head_commit.message, 'skip fix')
        uses: peter-evans/create-pull-request@v6
        id: cpr
        with:
          token: ${{ secrets.FIXIMUS_TOKEN || secrets.GITHUB_TOKEN }}
          commit-message: '[MegaLinter] Apply linters automatic fixes'
          title: '[MegaLinter] Apply linters automatic fixes'
          labels: bot
          branch: megalinter/fixes-${{ github.ref_name }}
          branch-suffix: timestamp
          delete-branch: true
          body: |
            ## MegaLinter Fixes

            MegaLinter has identified and fixed code style issues.

            ### 🔍 Changes Made
            - Automated code style fixes
            - Formatting improvements
            - Lint error corrections

            ### 📝 Notes
            - Please review the changes carefully
            - Run tests before merging
            - Verify formatting matches project standards

            > Generated automatically by MegaLinter

      - name: Commit Fixes
        if: |
          steps.ml.outputs.has_updated_sources == 1 &&
          (env.APPLY_FIXES_EVENT == 'all' || env.APPLY_FIXES_EVENT == github.event_name) &&
          env.APPLY_FIXES_MODE == 'commit' &&
          github.ref != 'refs/heads/main' &&
          (github.event_name == 'push' || github.event.pull_request.head.repo.full_name == github.repository) &&
          !contains(github.event.head_commit.message, 'skip fix')
        uses: stefanzweifel/git-auto-commit-action@v5
        with:
          branch: ${{ github.event.pull_request.head.ref || github.head_ref || github.ref }}
          commit_message: |
            style: apply MegaLinter fixes

            [skip ci]
          commit_user_name: fiximus
          commit_user_email: github-bot@ivuorinen.net
          push_options: --force

      - name: Create Status Check
        if: always()
        uses: actions/github-script@v7
        with:
          script: |
            const status = '${{ steps.check-results.outputs.status }}';
            const conclusion = status === 'success' ? 'success' : 'failure';

            const summary = `## MegaLinter Results

            ${status === 'success' ? '✅ All checks passed' : '❌ Issues found'}

            [View detailed report](${process.env.GITHUB_SERVER_URL}/${process.env.GITHUB_REPOSITORY}/actions/runs/${process.env.GITHUB_RUN_ID})
            `;

            await core.summary
              .addRaw(summary)
              .write();

            if (status !== 'success') {
              core.setFailed('MegaLinter found issues');
            }

      - name: Cleanup
        if: always()
        shell: bash
        run: |
          # Remove temporary files but keep reports
          find . -type f -name "megalinter.*" ! -name "megalinter-reports" -delete || true
          find . -type d -name ".megalinter" -exec rm -rf {} + || true

14
.github/workflows/release-drafter.yml
vendored
14
.github/workflows/release-drafter.yml
vendored
@@ -1,14 +0,0 @@
|
||||
---
|
||||
name: Release Drafter
|
||||
|
||||
# yamllint disable-line rule:truthy
|
||||
on:
|
||||
workflow_call:
|
||||
|
||||
permissions:
|
||||
contents: write
|
||||
statuses: write
|
||||
|
||||
jobs:
|
||||
Draft:
|
||||
uses: ivuorinen/.github/.github/workflows/sync-labels.yml@main
|
||||
18
.github/workflows/release.yml
vendored
Normal file
18
.github/workflows/release.yml
vendored
Normal file
@@ -0,0 +1,18 @@
|
||||
name: Release
|
||||
|
||||
on:
|
||||
push:
|
||||
tags:
|
||||
- 'v*'
|
||||
|
||||
permissions:
|
||||
contents: write
|
||||
|
||||
jobs:
|
||||
release:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: softprops/action-gh-release@v2.2.1
|
||||
with:
|
||||
generate_release_notes: true
|
||||
41
.github/workflows/scorecard.yml
vendored
Normal file
41
.github/workflows/scorecard.yml
vendored
Normal file
@@ -0,0 +1,41 @@
|
||||
name: Scorecard
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 2 * * 0'
|
||||
push:
|
||||
branches: [main]
|
||||
|
||||
permissions: read-all
|
||||
|
||||
jobs:
|
||||
analysis:
|
||||
name: Scorecard analysis
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
security-events: write
|
||||
id-token: write
|
||||
|
||||
steps:
|
||||
- name: Checkout code
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
persist-credentials: false
|
||||
|
||||
- name: Run analysis
|
||||
uses: ossf/scorecard-action@v2
|
||||
with:
|
||||
results_file: results.sarif
|
||||
results_format: sarif
|
||||
publish_results: true
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: SARIF file
|
||||
path: results.sarif
|
||||
retention-days: 5
|
||||
|
||||
- name: Upload to code-scanning
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: results.sarif
|
||||
174
.github/workflows/security-metrics.yml
vendored
Normal file
174
.github/workflows/security-metrics.yml
vendored
Normal file
@@ -0,0 +1,174 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: Security Metrics Collection
|
||||
|
||||
on:
|
||||
workflow_run:
|
||||
workflows: ['Security Checks']
|
||||
types:
|
||||
- completed
|
||||
schedule:
|
||||
- cron: '0 0 * * 0' # Weekly
|
||||
|
||||
jobs:
|
||||
collect-metrics:
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
contents: write
|
||||
issues: write
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
- name: Collect Metrics
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
const metrics = {
|
||||
timestamp: new Date().toISOString(),
|
||||
weekly: {
|
||||
scans: 0,
|
||||
vulnerabilities: {
|
||||
critical: 0,
|
||||
high: 0,
|
||||
medium: 0,
|
||||
low: 0
|
||||
},
|
||||
fixes: {
|
||||
submitted: 0,
|
||||
merged: 0
|
||||
},
|
||||
meanTimeToFix: null // Initialize as null instead of 0
|
||||
}
|
||||
};
|
||||
|
||||
try {
|
||||
// Collect scan metrics
|
||||
const scans = await github.rest.actions.listWorkflowRuns({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
workflow_id: 'security.yml',
|
||||
created: `>${new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString()}`
|
||||
});
|
||||
|
||||
metrics.weekly.scans = scans.data.total_count;
|
||||
|
||||
// Collect vulnerability metrics
|
||||
const vulnIssues = await github.rest.issues.listForRepo({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
labels: 'security',
|
||||
state: 'all',
|
||||
since: new Date(Date.now() - 7 * 24 * 60 * 60 * 1000).toISOString()
|
||||
});
|
||||
|
||||
// Calculate vulnerability metrics
|
||||
vulnIssues.data.forEach(issue => {
|
||||
if (issue.labels.find(l => l.name === 'critical')) metrics.weekly.vulnerabilities.critical++;
|
||||
if (issue.labels.find(l => l.name === 'high')) metrics.weekly.vulnerabilities.high++;
|
||||
if (issue.labels.find(l => l.name === 'medium')) metrics.weekly.vulnerabilities.medium++;
|
||||
if (issue.labels.find(l => l.name === 'low')) metrics.weekly.vulnerabilities.low++;
|
||||
});
|
||||
|
||||
// Calculate fix metrics
|
||||
const fixPRs = await github.rest.pulls.list({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
state: 'all',
|
||||
labels: 'security-fix'
|
||||
});
|
||||
|
||||
metrics.weekly.fixes.submitted = fixPRs.data.length;
|
||||
const mergedPRs = fixPRs.data.filter(pr => pr.merged_at); // list API exposes merged_at, not a merged flag
|
||||
metrics.weekly.fixes.merged = mergedPRs.length;
|
||||
|
||||
// Calculate mean time to fix only if there are merged PRs
|
||||
if (mergedPRs.length > 0) {
|
||||
const fixTimes = mergedPRs.map(pr => {
|
||||
const mergedAt = new Date(pr.merged_at);
|
||||
const createdAt = new Date(pr.created_at);
|
||||
return mergedAt - createdAt;
|
||||
});
|
||||
|
||||
const totalTime = fixTimes.reduce((a, b) => a + b, 0);
|
||||
// Convert to hours and round to 2 decimal places
|
||||
metrics.weekly.meanTimeToFix = Number((totalTime / (fixTimes.length * 3600000)).toFixed(2));
|
||||
}
|
||||
|
||||
// Save metrics
|
||||
const fs = require('fs');
|
||||
fs.writeFileSync('security-metrics.json', JSON.stringify(metrics, null, 2));
|
||||
|
||||
// Generate report
|
||||
const report = generateReport(metrics);
|
||||
|
||||
// Create/update metrics dashboard
|
||||
await github.rest.issues.create({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
title: '📊 Weekly Security Metrics Report',
|
||||
body: report,
|
||||
labels: ['metrics', 'security']
|
||||
});
|
||||
|
||||
} catch (error) {
|
||||
core.setFailed(`Failed to collect metrics: ${error.message}`);
|
||||
}
|
||||
|
||||
function generateReport(metrics) {
|
||||
const formatDuration = (hours) => {
|
||||
if (hours === null) return 'N/A';
|
||||
return `${hours} hours`;
|
||||
};
|
||||
|
||||
return `## 📊 Weekly Security Metrics Report
|
||||
|
||||
### Timeline
|
||||
- Report Generated: ${new Date().toISOString()}
|
||||
- Period: Last 7 days
|
||||
|
||||
### Security Scans
|
||||
- Total Scans Run: ${metrics.weekly.scans}
|
||||
|
||||
### Vulnerabilities
|
||||
- Critical: ${metrics.weekly.vulnerabilities.critical}
|
||||
- High: ${metrics.weekly.vulnerabilities.high}
|
||||
- Medium: ${metrics.weekly.vulnerabilities.medium}
|
||||
- Low: ${metrics.weekly.vulnerabilities.low}
|
||||
|
||||
### Fixes
|
||||
- PRs Submitted: ${metrics.weekly.fixes.submitted}
|
||||
- PRs Merged: ${metrics.weekly.fixes.merged}
|
||||
- Mean Time to Fix: ${formatDuration(metrics.weekly.meanTimeToFix)}
|
||||
|
||||
### Summary
|
||||
${generateSummary(metrics)}
|
||||
|
||||
> This report was automatically generated by the security metrics workflow.`;
|
||||
}
|
||||
|
||||
function generateSummary(metrics) {
|
||||
const total = Object.values(metrics.weekly.vulnerabilities).reduce((a, b) => a + b, 0);
|
||||
const fixRate = metrics.weekly.fixes.merged / metrics.weekly.fixes.submitted || 0;
|
||||
|
||||
let summary = [];
|
||||
|
||||
if (total === 0) {
|
||||
summary.push('✅ No vulnerabilities detected this week.');
|
||||
} else {
|
||||
summary.push(`⚠️ Detected ${total} total vulnerabilities.`);
|
||||
if (metrics.weekly.vulnerabilities.critical > 0) {
|
||||
summary.push(`🚨 ${metrics.weekly.vulnerabilities.critical} critical vulnerabilities require immediate attention!`);
|
||||
}
|
||||
}
|
||||
|
||||
if (metrics.weekly.fixes.submitted > 0) {
|
||||
summary.push(`🔧 Fix rate: ${(fixRate * 100).toFixed(1)}%`);
|
||||
}
|
||||
|
||||
if (metrics.weekly.meanTimeToFix !== null) {
|
||||
summary.push(`⏱️ Average time to fix: ${metrics.weekly.meanTimeToFix} hours`);
|
||||
}
|
||||
|
||||
return summary.join('\n');
|
||||
}
|
||||
134
.github/workflows/security-trends.yml
vendored
Normal file
134
.github/workflows/security-trends.yml
vendored
Normal file
@@ -0,0 +1,134 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: Security Trends Analysis
|
||||
|
||||
on:
|
||||
workflow_run:
|
||||
workflows: ['Security Checks']
|
||||
types:
|
||||
- completed
|
||||
|
||||
jobs:
|
||||
analyze-trends:
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
contents: write
|
||||
issues: write
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
with:
|
||||
ref: main
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Download latest results
|
||||
uses: actions/download-artifact@v4
|
||||
with:
|
||||
name: security-reports-${{ github.event.workflow_run.id }}
# Artifacts live on the triggering run, so download-artifact@v4 needs run-id and a token
run-id: ${{ github.event.workflow_run.id }}
github-token: ${{ secrets.GITHUB_TOKEN }}
path: latest-results
|
||||
|
||||
- name: Analyze Trends
|
||||
id: analyze
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
const path = require('path');
|
||||
|
||||
try {
|
||||
// ... [previous code remains the same until report generation]
|
||||
|
||||
// Generate trend report
|
||||
const report = generateTrendReport(trends);
|
||||
|
||||
// Save report explicitly for next step
|
||||
console.log('Writing trend report to file...');
|
||||
fs.writeFileSync('trend-report.md', report);
|
||||
console.log('Trend report saved successfully');
|
||||
|
||||
// Save history
|
||||
fs.writeFileSync(historyFile, JSON.stringify(history, null, 2));
|
||||
|
||||
// Generate and save chart
|
||||
const chartData = generateChartData(history);
|
||||
fs.writeFileSync('security-trends.svg', chartData);
|
||||
|
||||
// Set outputs for other steps
|
||||
core.setOutput('has_vulnerabilities',
|
||||
trends.critical.current > 0 || trends.high.current > 0);
|
||||
core.setOutput('trend_status',
|
||||
trends.critical.trend > 0 || trends.high.trend > 0 ? 'worsening' : 'improving');
|
||||
core.setOutput('report_path', 'trend-report.md');
|
||||
|
||||
} catch (error) {
|
||||
core.setFailed(`Failed to analyze trends: ${error.message}`);
|
||||
throw error;
|
||||
}
|
||||
|
||||
- name: Verify Report File
|
||||
id: verify
|
||||
shell: bash
|
||||
run: |
|
||||
if [ ! -f "trend-report.md" ]; then
|
||||
echo "::error::Trend report file not found"
|
||||
echo "exists=false" >> "$GITHUB_OUTPUT"
|
||||
exit 1
|
||||
else
|
||||
echo "exists=true" >> "$GITHUB_OUTPUT"
|
||||
echo "size=$(stat -f%z trend-report.md)" >> "$GITHUB_OUTPUT"
|
||||
fi
|
||||
|
||||
- name: Create Trend Report Issue
|
||||
if: |
|
||||
github.event.workflow_run.conclusion == 'success' &&
|
||||
steps.verify.outputs.exists == 'true'
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
try {
|
||||
const fs = require('fs');
|
||||
const reportPath = 'trend-report.md';
|
||||
|
||||
console.log('Reading trend report from:', reportPath);
|
||||
console.log('File size:', '${{ steps.verify.outputs.size }}', 'bytes');
|
||||
|
||||
if (!fs.existsSync(reportPath)) {
|
||||
throw new Error('Trend report file not found despite verification');
|
||||
}
|
||||
|
||||
const report = fs.readFileSync(reportPath, 'utf8');
|
||||
if (!report.trim()) {
|
||||
throw new Error('Trend report file is empty');
|
||||
}
|
||||
|
||||
const hasVulnerabilities = '${{ steps.analyze.outputs.has_vulnerabilities }}' === 'true';
|
||||
const trendStatus = '${{ steps.analyze.outputs.trend_status }}';
|
||||
|
||||
const title = `📊 Security Trend Report - ${
|
||||
hasVulnerabilities ?
|
||||
`⚠️ Vulnerabilities ${trendStatus}` :
|
||||
'✅ No vulnerabilities'
|
||||
}`;
|
||||
|
||||
await github.rest.issues.create({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
title: title,
|
||||
body: report,
|
||||
labels: ['security', 'metrics', hasVulnerabilities ? 'attention-required' : 'healthy']
|
||||
});
|
||||
|
||||
console.log('Successfully created trend report issue');
|
||||
|
||||
} catch (error) {
|
||||
console.error('Failed to create trend report issue:', error);
|
||||
core.setFailed(`Failed to create trend report issue: ${error.message}`);
|
||||
}
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
# Remove temporary files but keep the history
|
||||
rm -f trend-report.md security-trends.svg
|
||||
echo "Cleaned up temporary files"
|
||||
487
.github/workflows/security.yml
vendored
Normal file
487
.github/workflows/security.yml
vendored
Normal file
@@ -0,0 +1,487 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: Security Checks
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: '0 0 * * *' # Run daily at midnight
|
||||
workflow_dispatch:
|
||||
pull_request:
|
||||
paths:
|
||||
- '**/package.json'
|
||||
- '**/package-lock.json'
|
||||
- '**/yarn.lock'
|
||||
- '**/pnpm-lock.yaml'
|
||||
- '**/requirements.txt'
|
||||
- '**/Dockerfile'
|
||||
- '**/*.py'
|
||||
- '**/*.js'
|
||||
- '**/*.ts'
|
||||
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
env:
|
||||
SLACK_WEBHOOK: ${{ secrets.SLACK_WEBHOOK }}
|
||||
|
||||
jobs:
|
||||
security:
|
||||
name: Security Analysis
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 30
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
security-events: write
|
||||
issues: write
|
||||
actions: read
|
||||
pull-requests: write
|
||||
|
||||
steps:
|
||||
- name: Check Required Secrets
|
||||
id: check-secrets
|
||||
shell: bash
|
||||
run: |
|
||||
# Initialize flags
|
||||
{
|
||||
echo "run_snyk=false"
|
||||
echo "run_slack=false"
|
||||
echo "run_sonarcloud=false"
|
||||
} >> "$GITHUB_OUTPUT"
|
||||
|
||||
# Check secrets
|
||||
if [ -n "${{ secrets.SNYK_TOKEN }}" ]; then
|
||||
echo "run_snyk=true" >> "$GITHUB_OUTPUT"
|
||||
echo "Snyk token available"
|
||||
else
|
||||
echo "::warning::SNYK_TOKEN not set - Snyk scans will be skipped"
|
||||
fi
|
||||
|
||||
if [ -n "${{ secrets.SLACK_WEBHOOK }}" ]; then
|
||||
echo "run_slack=true" >> "$GITHUB_OUTPUT"
|
||||
echo "Slack webhook available"
|
||||
else
|
||||
echo "::warning::SLACK_WEBHOOK not set - Slack notifications will be skipped"
|
||||
fi
|
||||
|
||||
if [ -n "${{ secrets.SONAR_TOKEN }}" ]; then
|
||||
echo "run_sonarcloud=true" >> "$GITHUB_OUTPUT"
|
||||
echo "SonarCloud token available"
|
||||
else
|
||||
echo "::warning::SONAR_TOKEN not set - SonarCloud analysis will be skipped"
|
||||
fi
|
||||
|
||||
- name: Checkout Repository
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 0 # Full history for better analysis
|
||||
|
||||
- name: Run OWASP Dependency Check
|
||||
uses: dependency-check/Dependency-Check_Action@main
|
||||
with:
|
||||
project: 'GitHub Actions'
|
||||
path: '.'
|
||||
format: 'SARIF'
|
||||
out: 'reports'
|
||||
args: >
|
||||
--enableRetired
|
||||
--enableExperimental
|
||||
--failOnCVSS 7
|
||||
--suppress ${{ github.workspace }}/suppressions.xml
|
||||
|
||||
- name: Upload OWASP Results
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: reports/dependency-check-report.sarif
|
||||
category: owasp-dependency-check
|
||||
|
||||
- name: Setup Node.js
|
||||
if: steps.check-secrets.outputs.run_snyk == 'true'
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: 'lts/*'
|
||||
cache: 'npm'
|
||||
|
||||
- name: Run Snyk Scan
|
||||
id: snyk
|
||||
if: steps.check-secrets.outputs.run_snyk == 'true'
|
||||
uses: snyk/actions/node@master
|
||||
continue-on-error: true # Don't fail the workflow, we'll handle results
|
||||
env:
|
||||
SNYK_TOKEN: ${{ secrets.SNYK_TOKEN }}
|
||||
with:
|
||||
args: --all-projects --sarif-file-output=snyk-results.sarif
|
||||
|
||||
- name: Upload Snyk Results
|
||||
if: steps.check-secrets.outputs.run_snyk == 'true'
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: snyk-results.sarif
|
||||
category: snyk
|
||||
|
||||
- name: Analyze Vulnerabilities
|
||||
id: vuln-analysis
|
||||
if: always()
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
const exec = require('@actions/exec');
|
||||
|
||||
async function analyzeSarif(filePath, tool) {
|
||||
if (!fs.existsSync(filePath)) return null;
|
||||
|
||||
try {
|
||||
const sarif = JSON.parse(fs.readFileSync(filePath, 'utf8'));
|
||||
let counts = { critical: 0, high: 0, medium: 0, low: 0 };
|
||||
|
||||
sarif.runs.forEach(run => {
|
||||
run.results?.forEach(result => {
|
||||
let severity;
|
||||
if (tool === 'snyk') {
|
||||
severity = result.ruleId.includes('critical') ? 'critical' :
|
||||
result.ruleId.includes('high') ? 'high' :
|
||||
result.ruleId.includes('medium') ? 'medium' : 'low';
|
||||
} else {
|
||||
severity = result.level === 'error' ? 'high' :
|
||||
result.level === 'warning' ? 'medium' : 'low';
|
||||
}
|
||||
counts[severity]++;
|
||||
});
|
||||
});
|
||||
|
||||
return counts;
|
||||
} catch (error) {
|
||||
console.error(`Error analyzing ${tool} results:`, error);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
try {
|
||||
// Analyze results from different tools
|
||||
const results = {
|
||||
owasp: await analyzeSarif('reports/dependency-check-report.sarif', 'owasp'),
|
||||
snyk: ${{ steps.check-secrets.outputs.run_snyk == 'true' }} ?
|
||||
await analyzeSarif('snyk-results.sarif', 'snyk') : null
|
||||
};
|
||||
|
||||
// Calculate totals
|
||||
const summary = {
|
||||
timestamp: new Date().toISOString(),
|
||||
results,
|
||||
total: {
|
||||
critical: Object.values(results).reduce((sum, r) => sum + (r?.critical || 0), 0),
|
||||
high: Object.values(results).reduce((sum, r) => sum + (r?.high || 0), 0),
|
||||
medium: Object.values(results).reduce((sum, r) => sum + (r?.medium || 0), 0),
|
||||
low: Object.values(results).reduce((sum, r) => sum + (r?.low || 0), 0)
|
||||
}
|
||||
};
|
||||
|
||||
// Save summary
|
||||
fs.writeFileSync('vulnerability-summary.json', JSON.stringify(summary, null, 2));
|
||||
|
||||
// Set outputs for other steps
|
||||
core.setOutput('critical_count', summary.total.critical);
|
||||
core.setOutput('high_count', summary.total.high);
|
||||
|
||||
// Create/update status badge
|
||||
const badge = {
|
||||
schemaVersion: 1,
|
||||
label: 'vulnerabilities',
|
||||
message: `critical: ${summary.total.critical} high: ${summary.total.high}`,
|
||||
color: summary.total.critical > 0 ? 'red' :
|
||||
summary.total.high > 0 ? 'orange' : 'green'
|
||||
};
|
||||
|
||||
fs.writeFileSync('security-badge.json', JSON.stringify(badge));
|
||||
|
||||
// Generate markdown report
|
||||
const report = `## Security Scan Results
|
||||
|
||||
### Summary
|
||||
- Critical: ${summary.total.critical}
|
||||
- High: ${summary.total.high}
|
||||
- Medium: ${summary.total.medium}
|
||||
- Low: ${summary.total.low}
|
||||
|
||||
### Tool-specific Results
|
||||
${Object.entries(results)
|
||||
.filter(([_, r]) => r)
|
||||
.map(([tool, r]) => `
|
||||
#### ${tool.toUpperCase()}
|
||||
- Critical: ${r.critical}
|
||||
- High: ${r.high}
|
||||
- Medium: ${r.medium}
|
||||
- Low: ${r.low}
|
||||
`).join('\n')}
|
||||
`;
|
||||
|
||||
fs.writeFileSync('security-report.md', report);
|
||||
|
||||
// Write job summary
|
||||
await core.summary
|
||||
.addRaw(report)
|
||||
.write();
|
||||
|
||||
// Exit with error if critical/high vulnerabilities found
|
||||
if (summary.total.critical > 0 || summary.total.high > 0) {
|
||||
core.setFailed(`Found ${summary.total.critical} critical and ${summary.total.high} high severity vulnerabilities`);
|
||||
}
|
||||
} catch (error) {
|
||||
core.setFailed(`Analysis failed: ${error.message}`);
|
||||
}
|
||||
|
||||
- name: Archive Security Reports
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: security-reports-${{ github.run_id }}
|
||||
path: |
|
||||
reports/
|
||||
snyk-results.sarif
|
||||
vulnerability-summary.json
|
||||
security-report.md
|
||||
security-badge.json
|
||||
retention-days: 30
|
||||
|
||||
- name: Create Fix PRs
|
||||
if: always() && (steps.vuln-analysis.outputs.critical_count > 0 || steps.vuln-analysis.outputs.high_count > 0)
|
||||
uses: actions/github-script@v7
|
||||
continue-on-error: true
|
||||
with:
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
|
||||
async function createFixPR(vulnerability) {
|
||||
const branchName = `security/fix-${vulnerability.id}`;
|
||||
|
||||
try {
|
||||
// Create branch
|
||||
await exec.exec('git', ['checkout', '-b', branchName]);
|
||||
|
||||
// Apply fixes based on vulnerability type
|
||||
if (vulnerability.tool === 'snyk') {
|
||||
await exec.exec('npx', ['snyk', 'fix']);
|
||||
} else if (vulnerability.tool === 'owasp') {
|
||||
// Update dependencies to fixed versions
|
||||
if (fs.existsSync('package.json')) {
|
||||
await exec.exec('npm', ['audit', 'fix']);
|
||||
}
|
||||
}
|
||||
|
||||
// Check if there are changes
|
||||
const { stdout: status } = await exec.getExecOutput('git', ['status', '--porcelain']);
|
||||
if (!status) {
|
||||
console.log('No changes to commit');
|
||||
return null;
|
||||
}
|
||||
|
||||
// Commit changes
|
||||
await exec.exec('git', ['config', 'user.name', 'fiximus']);
|
||||
await exec.exec('git', ['config', 'user.email', 'github-bot@ivuorinen.net']);
|
||||
await exec.exec('git', ['add', '.']);
|
||||
await exec.exec('git', ['commit', '-m', `fix: ${vulnerability.title}`]);
|
||||
await exec.exec('git', ['push', 'origin', branchName]);
|
||||
|
||||
// Create PR
|
||||
const pr = await github.rest.pulls.create({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
title: `🔒 Security: ${vulnerability.title}`,
|
||||
body: generatePRBody(vulnerability),
|
||||
head: branchName,
|
||||
base: 'main'
|
||||
});
|
||||
|
||||
// Add labels
|
||||
await github.rest.issues.addLabels({
|
||||
owner: context.repo.owner,
|
||||
repo: context.repo.repo,
|
||||
issue_number: pr.data.number,
|
||||
labels: ['security-fix', 'automated-pr', 'dependencies']
|
||||
});
|
||||
|
||||
return pr.data.html_url;
|
||||
} catch (error) {
|
||||
console.error(`Failed to create fix PR: ${error.message}`);
|
||||
return null;
|
||||
}
|
||||
}
|
||||
|
||||
function generatePRBody(vulnerability) {
|
||||
return `## Security Fix
|
||||
|
||||
### Vulnerability Details
|
||||
- ID: ${vulnerability.id}
|
||||
- Severity: ${vulnerability.severity}
|
||||
- Tool: ${vulnerability.tool}
|
||||
|
||||
### Changes Made
|
||||
${vulnerability.fixes || 'Dependency updates to resolve security vulnerabilities'}
|
||||
|
||||
### Testing
|
||||
- [ ] Verify fix resolves the vulnerability
|
||||
- [ ] Run security scan to confirm fix
|
||||
- [ ] Test affected functionality
|
||||
|
||||
### Notes
|
||||
- This PR was automatically generated
|
||||
- Please review changes carefully
|
||||
- Additional manual changes may be needed
|
||||
|
||||
> Generated by security workflow`;
|
||||
}
|
||||
|
||||
try {
|
||||
// Process vulnerabilities from both tools
|
||||
const vulnFiles = ['snyk-results.sarif', 'reports/dependency-check-report.sarif'];
|
||||
const fixableVulnerabilities = [];
|
||||
|
||||
for (const file of vulnFiles) {
|
||||
if (fs.existsSync(file)) {
|
||||
const sarif = JSON.parse(fs.readFileSync(file, 'utf8'));
|
||||
const tool = file.includes('snyk') ? 'snyk' : 'owasp';
|
||||
|
||||
sarif.runs.forEach(run => {
|
||||
run.results?.forEach(result => {
|
||||
if (result.level === 'error' || result.level === 'critical') {
|
||||
fixableVulnerabilities.push({
|
||||
id: result.ruleId,
|
||||
title: result.message.text,
|
||||
severity: result.level,
|
||||
tool,
|
||||
fixes: result.fixes
|
||||
});
|
||||
}
|
||||
});
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
// Create PRs for fixable vulnerabilities
|
||||
const prUrls = [];
|
||||
for (const vuln of fixableVulnerabilities) {
|
||||
const prUrl = await createFixPR(vuln);
|
||||
if (prUrl) prUrls.push(prUrl);
|
||||
}
|
||||
|
||||
core.setOutput('fix_prs', prUrls.join('\n'));
|
||||
|
||||
if (prUrls.length > 0) {
|
||||
console.log(`Created ${prUrls.length} fix PRs:`);
|
||||
prUrls.forEach(url => console.log(`- ${url}`));
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to process vulnerabilities:', error);
|
||||
}
|
||||
|
||||
- name: Notify on Failure
|
||||
if: failure()
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
|
||||
try {
|
||||
const { repo, owner } = context.repo;
|
||||
const runUrl = `${process.env.GITHUB_SERVER_URL}/${owner}/${repo}/actions/runs/${process.env.GITHUB_RUN_ID}`;
|
||||
|
||||
// Read vulnerability summary if available
|
||||
let vulnSummary = '';
|
||||
if (fs.existsSync('vulnerability-summary.json')) {
|
||||
const summary = JSON.parse(fs.readFileSync('vulnerability-summary.json', 'utf8'));
|
||||
vulnSummary = `
|
||||
### Vulnerability Counts
|
||||
- Critical: ${summary.total.critical}
|
||||
- High: ${summary.total.high}
|
||||
- Medium: ${summary.total.medium}
|
||||
- Low: ${summary.total.low}
|
||||
`;
|
||||
}
|
||||
|
||||
const message = `## 🚨 Security Check Failure
|
||||
|
||||
Security checks have failed in the workflow run.
|
||||
|
||||
### Details
|
||||
- Run: [View Results](${runUrl})
|
||||
- Timestamp: ${new Date().toISOString()}
|
||||
|
||||
${vulnSummary}
|
||||
|
||||
### Reports
|
||||
Security scan reports are available in the workflow artifacts.
|
||||
|
||||
### Next Steps
|
||||
1. Review the security reports
|
||||
2. Address identified vulnerabilities
|
||||
3. Re-run security checks
|
||||
|
||||
> This issue was automatically created by the security workflow.`;
|
||||
|
||||
// Create GitHub issue
|
||||
const issue = await github.rest.issues.create({
|
||||
owner,
|
||||
repo,
|
||||
title: `🚨 Security Check Failure - ${new Date().toISOString().split('T')[0]}`,
|
||||
body: message,
|
||||
labels: ['security', 'automated-issue', 'high-priority'],
|
||||
assignees: ['ivuorinen']
|
||||
});
|
||||
|
||||
// Send Slack notification if configured
|
||||
if (process.env.SLACK_WEBHOOK) {
|
||||
// Node 20 runners provide a global fetch, so no extra module is required here
|
||||
await fetch(process.env.SLACK_WEBHOOK, {
|
||||
method: 'POST',
|
||||
headers: { 'Content-Type': 'application/json' },
|
||||
body: JSON.stringify({
|
||||
text: `🚨 Security checks failed in ${owner}/${repo}\nDetails: ${issue.data.html_url}`
|
||||
})
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to send notifications:', error);
|
||||
core.setFailed(`Notification failed: ${error.message}`);
|
||||
}
|
||||
|
||||
- name: Cleanup Old Issues
|
||||
if: always()
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
try {
|
||||
const { repo, owner } = context.repo;
|
||||
|
||||
const oldIssues = await github.rest.issues.listForRepo({
|
||||
owner,
|
||||
repo,
|
||||
state: 'open',
|
||||
labels: 'automated-issue,security',
|
||||
sort: 'created',
|
||||
direction: 'desc'
|
||||
});
|
||||
|
||||
// Keep only the latest 3 issues
|
||||
const issuesToClose = oldIssues.data.slice(3);
|
||||
|
||||
for (const issue of issuesToClose) {
|
||||
await github.rest.issues.update({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: issue.number,
|
||||
state: 'closed',
|
||||
state_reason: 'completed'
|
||||
});
|
||||
|
||||
await github.rest.issues.createComment({
|
||||
owner,
|
||||
repo,
|
||||
issue_number: issue.number,
|
||||
body: '🔒 Auto-closing this issue as newer security check results are available.'
|
||||
});
|
||||
}
|
||||
} catch (error) {
|
||||
console.error('Failed to cleanup old issues:', error);
|
||||
}
|
||||
50
.github/workflows/stale.yml
vendored
50
.github/workflows/stale.yml
vendored
@@ -1,18 +1,52 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: Stale
|
||||
|
||||
on:
|
||||
schedule:
|
||||
- cron: "0 8 * * *"
|
||||
- cron: '0 8 * * *'
|
||||
workflow_call:
|
||||
workflow_dispatch:
|
||||
|
||||
jobs:
|
||||
stale:
|
||||
permissions:
|
||||
contents: write
|
||||
issues: write
|
||||
pull-requests: write
|
||||
permissions:
|
||||
contents: read
|
||||
packages: read
|
||||
statuses: read
|
||||
uses: ivuorinen/.github/.github/workflows/stale.yml@main
|
||||
|
||||
jobs:
|
||||
stale:
|
||||
name: 🧹 Clean up stale issues and PRs
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
permissions:
|
||||
contents: write # only for delete-branch option
|
||||
issues: write
|
||||
pull-requests: write
|
||||
|
||||
steps:
|
||||
- name: 🚀 Run stale
|
||||
uses: actions/stale@v9.1.0
|
||||
with:
|
||||
repo-token: ${{ secrets.GITHUB_TOKEN }}
|
||||
days-before-stale: 30
|
||||
days-before-close: 7
|
||||
remove-stale-when-updated: true
|
||||
stale-issue-label: 'stale'
|
||||
exempt-issue-labels: 'no-stale,help-wanted'
|
||||
stale-issue-message: >
|
||||
There hasn't been any activity on this issue recently, so we
|
||||
clean up some of the older and inactive issues.
|
||||
|
||||
Please make sure to update to the latest version and
|
||||
check if that solves the issue. Let us know if that works for you
|
||||
by leaving a comment 👍
|
||||
|
||||
This issue has now been marked as stale and will be closed if no
|
||||
further activity occurs. Thanks!
|
||||
stale-pr-label: 'stale'
|
||||
exempt-pr-labels: 'no-stale'
|
||||
stale-pr-message: >
|
||||
There hasn't been any activity on this pull request recently. This
|
||||
pull request has been automatically marked as stale because of that
|
||||
and will be closed if no further activity occurs within 7 days.
|
||||
Thank you for your contributions.
|
||||
|
||||
250
.github/workflows/sync-labels.yml
vendored
250
.github/workflows/sync-labels.yml
vendored
@@ -1,21 +1,255 @@
|
||||
---
|
||||
name: Sync labels
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: Sync Labels
|
||||
|
||||
# yamllint disable-line rule:truthy
|
||||
on:
|
||||
push:
|
||||
branches:
|
||||
- main
|
||||
- master
|
||||
paths:
|
||||
- .github/labels.yml
|
||||
- '.github/labels.yml'
|
||||
- '.github/workflows/sync-labels.yml'
|
||||
schedule:
|
||||
- cron: "34 5 * * *"
|
||||
- cron: '34 5 * * *' # 5:34 AM UTC every day
|
||||
workflow_call:
|
||||
workflow_dispatch:
|
||||
|
||||
permissions:
|
||||
issues: write
|
||||
concurrency:
|
||||
group: ${{ github.workflow }}-${{ github.ref }}
|
||||
cancel-in-progress: true
|
||||
|
||||
permissions: read-all
|
||||
|
||||
jobs:
|
||||
SyncLabels:
|
||||
uses: ivuorinen/.github/.github/workflows/sync-labels.yml@main
|
||||
labels:
|
||||
name: ♻️ Sync Labels
|
||||
runs-on: ubuntu-latest
|
||||
timeout-minutes: 10
|
||||
|
||||
permissions:
|
||||
contents: read
|
||||
issues: write
|
||||
|
||||
steps:
|
||||
- name: Checkout Repository
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
fetch-depth: 1
|
||||
|
||||
- name: Validate Labels File
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
LABELS_URL="https://raw.githubusercontent.com/ivuorinen/actions/refs/heads/main/sync-labels/labels.yml"
|
||||
LABELS_FILE="labels.yml"
|
||||
|
||||
echo "Downloading labels file..."
|
||||
if ! curl -fsS --retry 3 --retry-delay 5 -o "$LABELS_FILE" "$LABELS_URL"; then
|
||||
echo "::error::Failed to download labels file"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "Validating YAML format..."
|
||||
if ! yq eval "$LABELS_FILE" > /dev/null 2>&1; then
|
||||
echo "::error::Invalid YAML format in labels file"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check for required fields in each label
|
||||
echo "Validating label definitions..."
|
||||
INVALID=0
|
||||
HAS_NAME=0
|
||||
HAS_COLOR=0
|
||||
HAS_DESCRIPTION=0
|
||||
CURRENT_LABEL=""
|
||||
|
||||
check_label_completion() {
|
||||
if [[ $HAS_NAME -eq 1 && $HAS_COLOR -eq 1 && $HAS_DESCRIPTION -eq 1 ]]; then
|
||||
echo "✓ Valid label: $CURRENT_LABEL"
|
||||
else
|
||||
echo "✗ Invalid label: $CURRENT_LABEL (missing:"
|
||||
[[ $HAS_NAME -eq 0 ]] && echo " - name"
|
||||
[[ $HAS_COLOR -eq 0 ]] && echo " - color"
|
||||
[[ $HAS_DESCRIPTION -eq 0 ]] && echo " - description"
|
||||
echo ")"
|
||||
INVALID=$((INVALID + 1))
|
||||
fi
|
||||
}
|
||||
|
||||
while IFS= read -r line || [ -n "$line" ]; do
|
||||
# Skip empty lines and comments
|
||||
[[ -z "$line" || "$line" =~ ^[[:space:]]*# ]] && continue
|
||||
|
||||
if [[ "$line" =~ ^-.*$ ]]; then
|
||||
# Check previous label completion before starting new one
|
||||
if [[ -n "$CURRENT_LABEL" ]]; then
|
||||
check_label_completion
|
||||
fi
|
||||
# New label definition found, reset checks
|
||||
HAS_NAME=0
|
||||
HAS_COLOR=0
|
||||
HAS_DESCRIPTION=0
|
||||
CURRENT_LABEL="(new label)"
|
||||
elif [[ "$line" =~ ^[[:space:]]+name:[[:space:]]*(.+)$ ]]; then
|
||||
HAS_NAME=1
|
||||
CURRENT_LABEL="${BASH_REMATCH[1]}"
|
||||
elif [[ "$line" =~ ^[[:space:]]+color:[[:space:]]*([0-9A-Fa-f]{6})$ ]]; then
|
||||
HAS_COLOR=1
|
||||
elif [[ "$line" =~ ^[[:space:]]+description:[[:space:]]+.+$ ]]; then
|
||||
HAS_DESCRIPTION=1
|
||||
fi
|
||||
done < "$LABELS_FILE"
|
||||
|
||||
# Check the last label
|
||||
if [[ -n "$CURRENT_LABEL" ]]; then
|
||||
check_label_completion
|
||||
fi
|
||||
|
||||
echo "Validation complete. Found $INVALID invalid label(s)."
|
||||
|
||||
if [ $INVALID -ne 0 ]; then
|
||||
echo "::error::Found $INVALID invalid label definition(s)"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "Labels file validated successfully"
|
||||
echo "labels_file=$LABELS_FILE" >> "$GITHUB_OUTPUT"
|
||||
|
||||
# Create validation summary
|
||||
{
|
||||
echo "## Label Validation Results"
|
||||
echo
|
||||
echo "- Total invalid labels: $INVALID"
|
||||
echo "- File: \`$LABELS_FILE\`"
|
||||
echo "- Timestamp: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
|
||||
echo
|
||||
echo "All labels have required fields:"
|
||||
echo "- name"
|
||||
echo "- color (6-digit hex)"
|
||||
echo "- description"
|
||||
} > validation-report.md
|
||||
|
||||
- name: Run Label Syncer
|
||||
id: sync
|
||||
uses: micnncim/action-label-syncer@3abd5e6e7981d5a790c6f8a7494374bd8c74b9c6
|
||||
env:
|
||||
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
with:
|
||||
manifest: ${{ steps.validate.outputs.labels_file }}
|
||||
prune: true
|
||||
|
||||
- name: Verify Label Sync
|
||||
id: verify
|
||||
if: success()
|
||||
shell: bash
|
||||
run: |
|
||||
echo "Verifying labels..."
|
||||
|
||||
# Get current labels from GitHub
|
||||
CURRENT_LABELS=$(gh api repos/${{ github.repository }}/labels --jq '.[].name')
|
||||
|
||||
# Get expected labels from file
|
||||
EXPECTED_LABELS=$(yq eval '.[] | .name' "${{ steps.validate.outputs.labels_file }}")
|
||||
|
||||
# Compare labels
|
||||
MISSING_LABELS=0
|
||||
while IFS= read -r label; do
|
||||
if ! echo "$CURRENT_LABELS" | grep -q "^${label}$"; then
|
||||
echo "::warning::Label not synced: $label"
|
||||
MISSING_LABELS=1
|
||||
fi
|
||||
done <<< "$EXPECTED_LABELS"
|
||||
|
||||
if [ $MISSING_LABELS -eq 1 ]; then
|
||||
echo "::error::Some labels failed to sync"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "All labels verified successfully"
|
||||
|
||||
- name: Generate Label Report
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
{
|
||||
echo "# Label Sync Report"
|
||||
echo "## Current Labels"
|
||||
echo
|
||||
# shellcheck disable=SC2016
|
||||
gh api repos/${{ github.repository }}/labels --jq '.[] | "- **" + .name + "** (`#" + .color + "`): " + .description' | sort
|
||||
echo
|
||||
echo "## Sync Status"
|
||||
echo "- ✅ Success: ${{ steps.sync.outcome == 'success' }}"
|
||||
echo "- 🔍 Verified: ${{ steps.verify.outcome == 'success' }}"
|
||||
echo
|
||||
echo "Generated at: $(date -u '+%Y-%m-%d %H:%M:%S UTC')"
|
||||
} > label-report.md
|
||||
|
||||
- name: Upload Label Report
|
||||
if: always()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: label-sync-report
|
||||
path: label-report.md
|
||||
retention-days: 7
|
||||
|
||||
- name: Notify on Failure
|
||||
if: failure()
|
||||
uses: actions/github-script@v7
|
||||
with:
|
||||
script: |
|
||||
const fs = require('fs');
|
||||
|
||||
try {
|
||||
const { repo, owner } = context.repo;
|
||||
const runUrl = `${process.env.GITHUB_SERVER_URL}/${owner}/${repo}/actions/runs/${process.env.GITHUB_RUN_ID}`;
|
||||
|
||||
const body = `## ⚠️ Label Sync Failed
|
||||
|
||||
The label synchronization workflow has failed.
|
||||
|
||||
### Details
|
||||
- Workflow: [View Run](${runUrl})
|
||||
- Repository: ${owner}/${repo}
|
||||
- Timestamp: ${new Date().toISOString()}
|
||||
|
||||
### Status
|
||||
- Validation: ${{ steps.validate.outcome }}
|
||||
- Sync: ${{ steps.sync.outcome }}
|
||||
- Verification: ${{ steps.verify.outcome }}
|
||||
|
||||
Please check the workflow logs for more details.
|
||||
|
||||
> This issue was automatically created by the label sync workflow.`;
|
||||
|
||||
await github.rest.issues.create({
|
||||
owner,
|
||||
repo,
|
||||
title: '⚠️ Label Sync Failure',
|
||||
body,
|
||||
labels: ['automation', 'bug'],
|
||||
assignees: ['ivuorinen'],
|
||||
}).catch(error => {
|
||||
if (error.status === 403) {
|
||||
console.error('Permission denied while creating issue. Please check workflow permissions.');
|
||||
} else if (error.status === 429) {
|
||||
console.error('Rate limit exceeded. Please try again later.');
|
||||
} else {
|
||||
console.error(`Failed to create issue: ${error.message}`);
|
||||
}
|
||||
throw error;
|
||||
});
|
||||
} catch (error) {
|
||||
console.error('Failed to create issue:', error);
|
||||
}
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
# Remove temporary files
|
||||
rm -f labels.yml
|
||||
|
||||
# Remove lock files if they exist
|
||||
find . -name ".lock" -type f -delete
|
||||
|
||||
2
.gitignore
vendored
2
.gitignore
vendored
@@ -132,3 +132,5 @@ Homestead.json
|
||||
.env*.local
|
||||
.vercel
|
||||
next-env.d.ts
|
||||
|
||||
megalinter-reports/
|
||||
|
||||
21
.gitleaks.toml
Normal file
21
.gitleaks.toml
Normal file
@@ -0,0 +1,21 @@
|
||||
[allowlist]
|
||||
description = "Allowlisted files"
|
||||
paths = [
|
||||
'''node_modules''',
|
||||
'''.git''',
|
||||
'''dist''',
|
||||
'''yarn.lock''',
|
||||
'''package-lock.json''',
|
||||
'''pnpm-lock.yaml'''
|
||||
]
|
||||
|
||||
[rules]
|
||||
[rules.github-token]
|
||||
description = "GitHub Token"
|
||||
regex = '''ghp_[0-9a-zA-Z]{36}'''
|
||||
tags = ["token", "github"]
|
||||
|
||||
[rules.secrets]
|
||||
description = "Generic Secret Pattern"
|
||||
regex = '''(?i)(secret|token|key|password|cert)[\s]*[=:]\s*['"][^'"]*['"]'''
|
||||
tags = ["key", "secret"]
|
||||
13
.markdownlint.json
Normal file
13
.markdownlint.json
Normal file
@@ -0,0 +1,13 @@
|
||||
{
|
||||
"default": true,
|
||||
"MD013": {
|
||||
"line_length": 120,
|
||||
"code_blocks": false,
|
||||
"tables": false
|
||||
},
|
||||
"MD024": {
|
||||
"siblings_only": true
|
||||
},
|
||||
"MD033": false,
|
||||
"MD041": false
|
||||
}
|
||||
35
.mega-linter.yml
Normal file
35
.mega-linter.yml
Normal file
@@ -0,0 +1,35 @@
|
||||
---
|
||||
# Configuration file for MegaLinter
|
||||
# See all available variables at
|
||||
# https://megalinter.io/configuration/ and in linters documentation
|
||||
|
||||
APPLY_FIXES: all
|
||||
SHOW_ELAPSED_TIME: false # Show elapsed time at the end of MegaLinter run
|
||||
PARALLEL: true
|
||||
VALIDATE_ALL_CODEBASE: true
|
||||
FILEIO_REPORTER: false # Generate file.io report
|
||||
GITHUB_STATUS_REPORTER: true # Generate GitHub status report
|
||||
IGNORE_GENERATED_FILES: true # Ignore generated files
|
||||
JAVASCRIPT_DEFAULT_STYLE: prettier # Default style for JavaScript
|
||||
PRINT_ALPACA: false # Print Alpaca logo in console
|
||||
SARIF_REPORTER: true # Generate SARIF report
|
||||
SHOW_SKIPPED_LINTERS: false # Show skipped linters in MegaLinter log
|
||||
|
||||
DISABLE_LINTERS:
|
||||
- REPOSITORY_DEVSKIM
|
||||
|
||||
ENABLE_LINTERS:
|
||||
- YAML_YAMLLINT
|
||||
- MARKDOWN_MARKDOWNLINT
|
||||
- YAML_PRETTIER
|
||||
- JSON_PRETTIER
|
||||
- JAVASCRIPT_ES
|
||||
- TYPESCRIPT_ES
|
||||
|
||||
YAML_YAMLLINT_CONFIG_FILE: .yamllint.yml
|
||||
MARKDOWN_MARKDOWNLINT_CONFIG_FILE: .markdownlint.json
|
||||
JAVASCRIPT_ES_CONFIG_FILE: .eslintrc.json
|
||||
TYPESCRIPT_ES_CONFIG_FILE: .eslintrc.json
|
||||
|
||||
FILTER_REGEX_EXCLUDE: >
|
||||
(node_modules|\.automation/test|docs/json-schemas|\.github/workflows)
|
||||
@@ -22,38 +22,40 @@ repos:
|
||||
args: [--autofix, --no-sort-keys]
|
||||
|
||||
- repo: https://github.com/igorshubovych/markdownlint-cli
|
||||
rev: v0.43.0
|
||||
rev: v0.44.0
|
||||
hooks:
|
||||
- id: markdownlint
|
||||
args: [-c, .markdownlint.yaml, --fix]
|
||||
args: [-c, .markdownlint.json, --fix]
|
||||
|
||||
- repo: https://github.com/adrienverge/yamllint
|
||||
rev: v1.35.1
|
||||
hooks:
|
||||
- id: yamllint
|
||||
|
||||
- repo: https://github.com/koalaman/shellcheck-precommit
|
||||
rev: v0.10.0
|
||||
hooks:
|
||||
- id: shellcheck
|
||||
|
||||
- repo: https://github.com/scop/pre-commit-shfmt
|
||||
rev: v3.10.0-2
|
||||
hooks:
|
||||
- id: shfmt
|
||||
|
||||
- repo: https://github.com/koalaman/shellcheck-precommit
|
||||
rev: v0.10.0
|
||||
hooks:
|
||||
- id: shellcheck
|
||||
args: ['--severity=warning']
|
||||
|
||||
- repo: https://github.com/rhysd/actionlint
|
||||
rev: v1.7.7
|
||||
hooks:
|
||||
- id: actionlint
|
||||
args: ['-shellcheck=']
|
||||
|
||||
- repo: https://github.com/renovatebot/pre-commit-hooks
|
||||
rev: 39.122.0
|
||||
rev: 39.156.0
|
||||
hooks:
|
||||
- id: renovate-config-validator
|
||||
|
||||
- repo: https://github.com/bridgecrewio/checkov.git
|
||||
rev: '3.2.354'
|
||||
rev: '3.2.360'
|
||||
hooks:
|
||||
- id: checkov
|
||||
args:
|
||||
|
||||
9
.prettierrc.yml
Normal file
9
.prettierrc.yml
Normal file
@@ -0,0 +1,9 @@
|
||||
---
|
||||
printWidth: 120
|
||||
tabWidth: 2
|
||||
useTabs: false
|
||||
semi: true
|
||||
singleQuote: true
|
||||
trailingComma: 'es5'
|
||||
bracketSpacing: true
|
||||
arrowParens: 'avoid'
|
||||
1
.shellcheckrc
Normal file
1
.shellcheckrc
Normal file
@@ -0,0 +1 @@
|
||||
disable=SC2129
|
||||
11
.yamllint.yml
Normal file
11
.yamllint.yml
Normal file
@@ -0,0 +1,11 @@
|
||||
---
|
||||
extends: default
|
||||
|
||||
rules:
|
||||
line-length:
|
||||
max: 120
|
||||
level: warning
|
||||
truthy:
|
||||
check-keys: false
|
||||
comments:
|
||||
min-spaces-from-content: 1
|
||||
175
README.md
175
README.md
@@ -1,123 +1,88 @@
|
||||
# ivuorinen/actions - My Reusable GitHub Actions and Workflows
|
||||
|
||||
This repository contains reusable GitHub Actions and Workflows that
|
||||
I have created for my own use. Feel free to use them in your own projects.
|
||||
## Overview
|
||||
|
||||
## Actions
|
||||
This project contains a collection of workflows and composable actions to streamline CI/CD
processes and ensure code quality. Below is a categorized list of all workflows, grouped by type.
|
||||
|
||||
These actions are composable and can be used together to create more complex workflows.
|
||||
## Testing Workflows
|
||||
|
||||
### `ivuorinen/actions/php-composer`
|
||||
- [PHP Tests][php-tests]: Runs PHPUnit tests to ensure PHP code correctness.
|
||||
|
||||
This action sets up PHP with specified version and installs Composer dependencies.
|
||||
## Linting and Formatting Workflows
|
||||
|
||||
#### Inputs
|
||||
- [Ansible Lint and Fix][ansible-lint-fix]: Lints and fixes Ansible playbooks and roles.
|
||||
- [Biome Check][biome-check]: Runs Biome to lint multiple languages and formats.
|
||||
- [Biome Fix][biome-fix]: Automatically fixes issues detected by Biome.
|
||||
- [C# Lint Check][csharp-lint-check]: Lints C# code using tools like `dotnet-format`.
|
||||
- [ESLint Check][eslint-check]: Runs ESLint to check for code style violations.
|
||||
- [ESLint Fix][eslint-fix]: Automatically fixes code style issues with ESLint.
|
||||
- [Go Lint Check][go-lint]: Lints Go code using `golangci-lint`.
|
||||
- [Prettier Check][prettier-check]: Checks code formatting using Prettier.
|
||||
- [Prettier Fix][prettier-fix]: Automatically fixes code formatting with Prettier.
|
||||
- [Python Lint and Fix][python-lint-fix]: Lints and fixes Python code using `flake8` and `black`.
|
||||
- [Terraform Lint and Fix][terraform-lint-fix]: Lints and fixes Terraform
|
||||
configurations.
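
All of the checks above are composite actions, so any of them can be dropped into a job as a single step. A minimal sketch wiring the ESLint check into a pull-request workflow; the trigger, the path filters, and the assumption that the action runs with its default inputs are illustrative, not requirements taken from the action itself:

```yaml
# Illustrative caller workflow, e.g. .github/workflows/lint.yml
name: Lint
on:
  pull_request:
    paths:
      - '**/*.js'
      - '**/*.ts'
jobs:
  eslint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Assumes eslint-check needs no inputs; see eslint-check/README.md for details.
      - uses: ivuorinen/actions/eslint-check@main
```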
|
||||
|
||||
- `php`: PHP version to use (default: `8.3`)
|
||||
- `args`: Additional arguments to pass to Composer
|
||||
## Build Workflows
|
||||
|
||||
#### Example
|
||||
- [C# Build][csharp-build]: Builds C# projects using the .NET SDK.
|
||||
- [Docker Build][docker-build]: Builds Docker images using a Dockerfile.
|
||||
- [Go Build][go-build]: Builds Go projects using the `go build` command.
|
||||
|
||||
```yaml
|
||||
on:
|
||||
workflow_dispatch:
|
||||
workflow_call:
|
||||
pull_request:
|
||||
paths:
|
||||
- 'composer.json'
|
||||
- 'composer.lock'
|
||||
jobs:
|
||||
build:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
- uses: ivuorinen/actions/php-composer@main
|
||||
with:
|
||||
php: '8.3'
|
||||
args: '--no-dev'
|
||||
```
|
||||
## Deployment Workflows
|
||||
|
||||
### `ivuorinen/actions/set-git-config`
|
||||
- [C# Publish][csharp-publish]: Publishes .NET projects to an output directory.
|
||||
- [Docker Publish to Docker Hub][docker-publish-hub]: Publishes Docker images to Docker Hub.
|
||||
- [Docker Publish to GitHub Packages][docker-publish-gh]: Publishes Docker images to GitHub's Container Registry.
|
||||
- [Publish to NPM][npm-publish]: Publishes packages to the NPM registry.
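
The build and publish actions can be chained inside a single job to go from source to a published image. A rough sketch for publishing to GitHub's Container Registry; inputs are deliberately omitted, because each action's README documents its own required configuration (image name, registry credentials, tags):

```yaml
# Illustrative caller workflow, e.g. .github/workflows/docker-publish.yml
name: Docker Publish
on:
  push:
    tags:
      - 'v*'
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Inputs intentionally left out; consult docker-build/README.md and
      # docker-publish-gh/README.md for the actual parameters.
      - uses: ivuorinen/actions/docker-build@main
      - uses: ivuorinen/actions/docker-publish-gh@main
```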
|
||||
|
||||
This action sets up Git configuration for the repository.
|
||||
## Release Workflows
|
||||
|
||||
#### Inputs
|
||||
- [GitHub Release][github-release]: Automates GitHub release creation with custom tags and notes.
|
||||
- [Release Monthly][release-monthly]: Creates a monthly GitHub release with autogenerated notes.
|
||||
|
||||
- `name`: Name to use for Git commits (default: `GitHub Actions`)
|
||||
- `email`: Email to use for Git commits (default: `github-actions@github.com`)
|
||||
- `token`: GitHub token to use for Git commits (default: `${{ github.token }}`)
|
||||
## Utility Workflows
|
||||
|
||||
#### Example
|
||||
|
||||
```yaml
|
||||
on:
|
||||
workflow_dispatch:
|
||||
workflow_call:
|
||||
pull_request:
|
||||
paths:
|
||||
- '.gitignore'
|
||||
jobs:
|
||||
build:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v4
|
||||
- uses: ivuorinen/actions/set-git-config@main
|
||||
with:
|
||||
name: 'GitHub Actions'
|
||||
email: 'github-actions@github.com'
|
||||
token: ${{ secrets.GITHUB_TOKEN }}
|
||||
```
|
||||
|
||||
## Workflows
|
||||
|
||||
These workflows are complete examples that can be used as-is or as a starting point for your own workflows.
|
||||
|
||||
### `ivuorinen/actions/compress-images`
|
||||
|
||||
This workflow compresses images in a repository using [calibreapp/image-actions](https://github.com/calibreapp/image-actions).
|
||||
The action defines a cron job that runs at 23:00 on Sundays; if compression produces any changes, it creates a pull request with the compressed images.
|
||||
|
||||
#### Example
|
||||
|
||||
```yaml
|
||||
# .github/workflows/compress-images.yml
|
||||
jobs:
|
||||
compress-images:
|
||||
uses: ivuorinen/actions/compress-images@main
|
||||
```
|
||||
|
||||
### `ivuorinen/actions/release-monthly`
|
||||
|
||||
This workflow creates a monthly release with the current date as the tag name.
|
||||
|
||||
#### Example
|
||||
|
||||
```yaml
|
||||
# .github/workflows/release-monthly.yml
|
||||
jobs:
|
||||
release-monthly:
|
||||
uses: ivuorinen/actions/release-monthly@main
|
||||
```
|
||||
|
||||
### `ivuorinen/actions/php-laravel-phpunit`
|
||||
|
||||
This workflow sets up PHP with Composer and runs PHPUnit tests for a Laravel project.
|
||||
|
||||
#### Example
|
||||
|
||||
```yaml
|
||||
# .github/workflows/php-laravel-phpunit.yml
|
||||
jobs:
|
||||
laravel:
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v2
|
||||
- uses: ivuorinen/actions/php-composer@main
|
||||
with:
|
||||
php: '8.3'
|
||||
args: '--no-dev'
|
||||
```
|
||||
- [Common File Check][common-file-check]: Checks for the presence of specific files based on a glob pattern.
|
||||
- [Compress Images][compress-images]: Optimizes and creates a pull request with compressed images.
|
||||
- [Dotnet Version Detect][dotnet-v-detect]: Detects the required .NET version from `global.json`.
|
||||
- [Go Version Detect][go-version-detect]: Detects the required Go version from configuration files.
|
||||
- [Node Setup][node-setup]: Sets up a Node.js environment for workflows.
|
||||
- [PHP Composer][php-composer]: Installs PHP dependencies using Composer.
|
||||
- [Pre-Commit][pre-commit]: Runs `pre-commit` hooks to enforce code quality standards.
|
||||
- [Set Git Config][set-git-config]: Configures Git user information for automated commits.
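
Because the utilities are composite actions, they compose naturally inside one job. A hedged sketch that sets up Node.js, configures Git, and runs pre-commit, assuming all three actions work with their default inputs:

```yaml
# Illustrative caller workflow, e.g. .github/workflows/checks.yml
name: Checks
on:
  pull_request:
jobs:
  pre-commit:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/node-setup@main
      - uses: ivuorinen/actions/set-git-config@main
      - uses: ivuorinen/actions/pre-commit@main
```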
|
||||
|
||||
## License
|
||||
|
||||
The code in this repository is licensed under the MIT License. See the [LICENSE.md](LICENSE.md) file for details.
|
||||
This project is licensed under the MIT License. See the [LICENSE](LICENSE.md) file for details.
|
||||
|
||||
[ansible-lint-fix]: ansible-lint-fix/README.md
|
||||
[biome-check]: biome-check/README.md
|
||||
[biome-fix]: biome-fix/README.md
|
||||
[common-file-check]: common-file-check/README.md
|
||||
[compress-images]: compress-images/README.md
|
||||
[csharp-build]: csharp-build/README.md
|
||||
[csharp-lint-check]: csharp-lint-check/README.md
|
||||
[csharp-publish]: csharp-publish/README.md
|
||||
[docker-build]: docker-build/README.md
|
||||
[docker-publish-gh]: docker-publish-gh/README.md
|
||||
[docker-publish-hub]: docker-publish-hub/README.md
|
||||
[dotnet-v-detect]: dotnet-version-detect/README.md
|
||||
[eslint-check]: eslint-check/README.md
|
||||
[eslint-fix]: eslint-fix/README.md
|
||||
[github-release]: github-release/README.md
|
||||
[go-build]: go-build/README.md
|
||||
[go-lint]: go-lint/README.md
|
||||
[go-version-detect]: go-version-detect/README.md
|
||||
[node-setup]: node-setup/README.md
|
||||
[npm-publish]: npm-publish/README.md
|
||||
[php-composer]: php-composer/README.md
|
||||
[php-tests]: php-tests/README.md
|
||||
[pre-commit]: pre-commit/README.md
|
||||
[prettier-check]: prettier-check/README.md
|
||||
[prettier-fix]: prettier-fix/README.md
|
||||
[python-lint-fix]: python-lint-fix/README.md
|
||||
[release-monthly]: release-monthly/README.md
|
||||
[set-git-config]: set-git-config/README.md
|
||||
[terraform-lint-fix]: terraform-lint-fix/README.md
|
||||
|
||||
17
ansible-lint-fix/README.md
Normal file
17
ansible-lint-fix/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/ansible-lint-fix
|
||||
|
||||
## Ansible Lint and Fix
|
||||
|
||||
### Description
|
||||
|
||||
Lints and fixes Ansible playbooks, commits changes, and uploads SARIF report.
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/ansible-lint-fix@main
|
||||
```
|
||||
52
ansible-lint-fix/action.yml
Normal file
52
ansible-lint-fix/action.yml
Normal file
@@ -0,0 +1,52 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Ansible Lint and Fix
|
||||
description: 'Lints and fixes Ansible playbooks, commits changes, and uploads SARIF report.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'play'
|
||||
color: 'green'
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Check for Ansible Files
|
||||
shell: bash
|
||||
run: |
|
||||
if ! find . -name "*.yml" | grep -q .; then
|
||||
echo "No Ansible files found. Skipping lint and fix."
|
||||
exit 0
|
||||
fi
|
||||
|
||||
- name: Install ansible-lint
|
||||
shell: bash
|
||||
run: |
|
||||
pip install ansible-lint==6.22.1 || {
|
||||
echo "::error::Failed to install ansible-lint"
|
||||
exit 1
|
||||
}
|
||||
|
||||
- name: Run ansible-lint
|
||||
shell: bash
|
||||
run: |
|
||||
ansible-lint --write --parseable-severity --format sarif > ansible-lint.sarif
|
||||
|
||||
- name: Set Git Config for Fixes
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
- name: Commit Fixes
|
||||
shell: bash
|
||||
run: |
|
||||
if git diff --quiet; then
|
||||
echo "No changes to commit."
|
||||
else
|
||||
git add .
|
||||
git commit -m "fix: applied ansible lint fixes"
|
||||
git push
|
||||
fi
|
||||
|
||||
- name: Upload SARIF Report
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: ansible-lint.sarif
|
||||
17
biome-check/README.md
Normal file
17
biome-check/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/biome-check
|
||||
|
||||
## Biome Check
|
||||
|
||||
### Description
|
||||
|
||||
Run Biome check on the repository
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/biome-check@main
|
||||
```
|
||||
36
biome-check/action.yml
Normal file
@@ -0,0 +1,36 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Biome Check
description: Run Biome check on the repository
author: Ismo Vuorinen

branding:
  icon: check-circle
  color: green

runs:
  using: composite
  steps:
    - name: Checkout Repository
      uses: actions/checkout@v4

    - name: Set Git Config
      uses: ivuorinen/actions/set-git-config@main

    - name: Node Setup
      uses: ivuorinen/actions/node-setup@main

    - name: Install Dependencies
      shell: bash
      run: |
        npm install -g biome

    - name: Run Biome Check
      shell: bash
      run: |
        biome check . --json > biome-report.json

    - name: Upload Biome Results
      uses: github/codeql-action/upload-sarif@v2
      with:
        sarif_file: biome-report.json
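Since the action performs its own checkout, a caller only needs to invoke it; the sketch below is illustrative, and the `security-events: write` permission is an assumption based on the upload step above.

```yaml
# Hypothetical caller workflow (illustration only).
name: biome-check
on: [pull_request]
permissions:
  security-events: write # assumed for the codeql-action upload step
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: ivuorinen/actions/biome-check@main
```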
17
biome-fix/README.md
Normal file
@@ -0,0 +1,17 @@
# ivuorinen/actions/biome-fix

## Biome Fix

### Description

Run Biome fix on the repository

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/biome-fix@main
```
38
biome-fix/action.yml
Normal file
@@ -0,0 +1,38 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Biome Fix
description: Run Biome fix on the repository
author: Ismo Vuorinen

branding:
  icon: check-circle
  color: green

runs:
  using: composite
  steps:
    - name: Checkout Repository
      uses: actions/checkout@v4

    - name: Set Git Config
      uses: ivuorinen/actions/set-git-config@main

    - name: Node Setup
      uses: ivuorinen/actions/node-setup@main

    - name: Install Dependencies
      shell: bash
      run: |
        npm install -g biome

    - name: Run Biome Fix
      shell: bash
      run: |
        biome fix .

    - name: Push Fixes
      if: success()
      uses: stefanzweifel/git-auto-commit-action@v5
      with:
        commit_message: 'style: autofix Biome violations'
        add_options: '-u'
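A minimal caller sketch, assuming the repository allows the auto-commit to push back; the trigger and permission shown are assumptions, not part of this commit.

```yaml
# Hypothetical caller workflow (illustration only).
name: biome-fix
on: [push]
permissions:
  contents: write # assumed so git-auto-commit-action can push the fixes
jobs:
  fix:
    runs-on: ubuntu-latest
    steps:
      - uses: ivuorinen/actions/biome-fix@main
```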
72
common-cache/README.md
Normal file
@@ -0,0 +1,72 @@
# ivuorinen/actions/common-cache

## Common Cache

### Description

Standardized caching strategy for all actions

### Inputs

| name | description | required | default |
| -------------- | ---------------------------------------------------- | -------- | ------- |
| `type` | <p>Type of cache (npm, composer, go, pip, etc.)</p> | `true` | `""` |
| `paths` | <p>Paths to cache (comma-separated)</p> | `true` | `""` |
| `key-prefix` | <p>Custom prefix for cache key</p> | `false` | `""` |
| `key-files` | <p>Files to hash for cache key (comma-separated)</p> | `false` | `""` |
| `restore-keys` | <p>Fallback keys for cache restoration</p> | `false` | `""` |
| `env-vars` | <p>Environment variables to include in cache key</p> | `false` | `""` |

### Outputs

| name | description |
| ------------- | --------------------------- |
| `cache-hit` | <p>Cache hit indicator</p> |
| `cache-key` | <p>Generated cache key</p> |
| `cache-paths` | <p>Resolved cache paths</p> |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/common-cache@main
  with:
    type:
    # Type of cache (npm, composer, go, pip, etc.)
    #
    # Required: true
    # Default: ""

    paths:
    # Paths to cache (comma-separated)
    #
    # Required: true
    # Default: ""

    key-prefix:
    # Custom prefix for cache key
    #
    # Required: false
    # Default: ""

    key-files:
    # Files to hash for cache key (comma-separated)
    #
    # Required: false
    # Default: ""

    restore-keys:
    # Fallback keys for cache restoration
    #
    # Required: false
    # Default: ""

    env-vars:
    # Environment variables to include in cache key
    #
    # Required: false
    # Default: ""
```
102
common-cache/action.yml
Normal file
@@ -0,0 +1,102 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Common Cache
description: 'Standardized caching strategy for all actions'
author: 'Ismo Vuorinen'

branding:
  icon: database
  color: gray-dark

inputs:
  type:
    description: 'Type of cache (npm, composer, go, pip, etc.)'
    required: true
  paths:
    description: 'Paths to cache (comma-separated)'
    required: true
  key-prefix:
    description: 'Custom prefix for cache key'
    required: false
    default: ''
  key-files:
    description: 'Files to hash for cache key (comma-separated)'
    required: false
    default: ''
  restore-keys:
    description: 'Fallback keys for cache restoration'
    required: false
    default: ''
  env-vars:
    description: 'Environment variables to include in cache key'
    required: false
    default: ''

outputs:
  cache-hit:
    description: 'Cache hit indicator'
    value: ${{ steps.cache.outputs.cache-hit }}
  cache-key:
    description: 'Generated cache key'
    value: ${{ steps.prepare.outputs.cache-key }}
  cache-paths:
    description: 'Resolved cache paths'
    value: ${{ steps.prepare.outputs.cache-paths }}

runs:
  using: composite
  steps:
    - id: prepare
      shell: bash
      run: |
        # Generate standardized cache key components
        os_key="${{ runner.os }}"
        type_key="${{ inputs.type }}"
        prefix_key="${{ inputs.key-prefix }}"

        # Process file hashes
        files_hash=""
        if [ -n "${{ inputs.key-files }}" ]; then
          IFS=',' read -ra FILES <<< "${{ inputs.key-files }}"
          for file in "${FILES[@]}"; do
            if [ -f "$file" ]; then
              file_hash=$(sha256sum "$file" | cut -d' ' -f1)
              files_hash="${files_hash}-${file_hash}"
            fi
          done
        fi

        # Process environment variables
        env_hash=""
        if [ -n "${{ inputs.env-vars }}" ]; then
          IFS=',' read -ra VARS <<< "${{ inputs.env-vars }}"
          for var in "${VARS[@]}"; do
            if [ -n "${!var}" ]; then
              env_hash="${env_hash}-${var}-${!var}"
            fi
          done
        fi

        # Generate final cache key
        cache_key="${os_key}"
        [ -n "$prefix_key" ] && cache_key="${cache_key}-${prefix_key}"
        [ -n "$type_key" ] && cache_key="${cache_key}-${type_key}"
        [ -n "$files_hash" ] && cache_key="${cache_key}-${files_hash}"
        [ -n "$env_hash" ] && cache_key="${cache_key}-${env_hash}"

        echo "cache-key=${cache_key}" >> $GITHUB_OUTPUT

        # Process cache paths
        IFS=',' read -ra PATHS <<< "${{ inputs.paths }}"
        cache_paths=""
        for path in "${PATHS[@]}"; do
          cache_paths="${cache_paths}${path}\n"
        done
        echo "cache-paths=${cache_paths}" >> $GITHUB_OUTPUT

    - id: cache
      uses: actions/cache@v4
      with:
        path: ${{ steps.prepare.outputs.cache-paths }}
        key: ${{ steps.prepare.outputs.cache-key }}
        restore-keys: ${{ inputs.restore-keys }}
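The prepare step concatenates `runner.os`, the optional prefix, the cache type, and a sha256 hash per listed key file, so a lockfile change invalidates the cache. A minimal sketch of how a caller might use it for an npm project follows; the paths, key files, and restore key shown are illustrative assumptions, not defaults of this action.

```yaml
# Hypothetical usage for an npm project (illustration only).
- uses: ivuorinen/actions/common-cache@main
  id: npm-cache
  with:
    type: npm
    paths: ~/.npm,node_modules        # assumed cache locations
    key-files: package-lock.json      # hashed into the cache key
    restore-keys: ${{ runner.os }}-npm-
- if: steps.npm-cache.outputs.cache-hit != 'true'
  run: npm ci
  shell: bash
```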
36
common-file-check/README.md
Normal file
@@ -0,0 +1,36 @@
# ivuorinen/actions/common-file-check

## Common File Check

### Description

A reusable action to check if a specific file or type of files exists in the repository.
Emits an output 'found' which is true or false.

### Inputs

| name | description | required | default |
| -------------- | --------------------------------------- | -------- | ------- |
| `file-pattern` | <p>Glob pattern for files to check.</p> | `true` | `""` |

### Outputs

| name | description |
| ------- | -------------------------------------------------------------- |
| `found` | <p>Indicates if the files matching the pattern were found.</p> |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/common-file-check@main
  with:
    file-pattern:
    # Glob pattern for files to check.
    #
    # Required: true
    # Default: ""
```
33
common-file-check/action.yml
Normal file
@@ -0,0 +1,33 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Common File Check
description: |
  A reusable action to check if a specific file or type of files exists in the repository.
  Emits an output 'found' which is true or false.
author: 'Ismo Vuorinen'
branding:
  icon: search
  color: gray-dark

inputs:
  file-pattern:
    description: 'Glob pattern for files to check.'
    required: true

outputs:
  found:
    description: 'Indicates if the files matching the pattern were found.'
    value: ${{ steps.check-files.outputs.found }}

runs:
  using: composite
  steps:
    - name: Check for Files
      id: check-files
      shell: bash
      run: |
        if find . -name "${{ inputs.file-pattern }}" | grep -q .; then
          echo "found=true" >> $GITHUB_OUTPUT
        else
          echo "found=false" >> $GITHUB_OUTPUT
        fi
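The `found` output is written to `$GITHUB_OUTPUT`, so callers can branch on it. A minimal sketch (step id and pattern are illustrative):

```yaml
# Hypothetical usage (illustration only): gate a later step on the output.
- uses: ivuorinen/actions/common-file-check@main
  id: has-yaml
  with:
    file-pattern: '*.yml'
- if: steps.has-yaml.outputs.found == 'true'
  run: echo "YAML files present, running lint"
  shell: bash
```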
17
compress-images/README.md
Normal file
@@ -0,0 +1,17 @@
# ivuorinen/actions/compress-images

## Compress Images

### Description

Compress images on demand (workflow_dispatch), and at 11pm every Sunday (schedule).

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/compress-images@main
```
compress-images/action.yml
@@ -1,39 +1,31 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
#
# Compress images on demand (workflow_dispatch), and at 11pm every Sunday (schedule).
# Open a Pull Request if any images can be compressed.
name: Compress Images on Demand
name: Compress Images
description: Compress images on demand (workflow_dispatch), and at 11pm every Sunday (schedule).
author: Ismo Vuorinen

on:
workflow_dispatch:
workflow_call:
schedule:
- cron: '00 23 * * 0'

jobs:
CompressOnDemandOrSchedule:
name: calibreapp/image-actions

runs-on: ubuntu-latest

permissions:
contents: write
statuses: write
pull-requests: write
branding:
icon: image
color: blue

runs:
using: composite
steps:
- uses: ivuorinen/actions/set-git-config@main
- name: Set Git Config
uses: ivuorinen/actions/set-git-config@main

- name: Checkout Repo
- name: Checkout Repository
uses: actions/checkout@v4

- name: Compress Images
id: calibre
uses: calibreapp/image-actions@main
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
compressOnly: true
githubToken: ${{ steps.set-git-config.outputs.token }}

- name: Create New Pull Request If Needed
if: steps.calibre.outputs.markdown != ''
29
csharp-build/README.md
Normal file
@@ -0,0 +1,29 @@
# ivuorinen/actions/csharp-build

## C# Build

### Description

Builds and tests C# projects.

### Inputs

| name | description | required | default |
| ---------------- | ---------------------------------- | -------- | ------- |
| `dotnet-version` | <p>Version of .NET SDK to use.</p> | `false` | `""` |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/csharp-build@main
  with:
    dotnet-version:
    # Version of .NET SDK to use.
    #
    # Required: false
    # Default: ""
```
48
csharp-build/action.yml
Normal file
@@ -0,0 +1,48 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: C# Build
description: 'Builds and tests C# projects.'
author: 'Ismo Vuorinen'

branding:
  icon: 'code'
  color: 'blue'

inputs:
  dotnet-version:
    description: 'Version of .NET SDK to use.'
    required: false

runs:
  using: composite
  steps:
    - name: Detect .NET SDK Version
      uses: ivuorinen/actions/dotnet-version-detect@main
      with:
        default-version: '7.0'

    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v3
      with:
        dotnet-version: '${{ steps.detect-dotnet-version.outputs.dotnet-version }}'

    - name: Restore Dependencies
      shell: bash
      run: dotnet restore

    - name: Build Solution
      shell: bash
      run: dotnet build --configuration Release --no-restore

    - name: Run Tests
      shell: bash
      run: |
        dotnet test --configuration Release --no-build --collect:"XPlat Code Coverage" --logger "trx;LogFileName=test-results.trx"

    - name: Upload Test Results
      uses: actions/upload-artifact@v3
      with:
        name: test-results
        path: |
          **/*.trx
          **/TestResults/**/coverage.cobertura.xml
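A caller only needs a checkout before invoking the action; SDK version selection is handled by the dotnet-version-detect step above. A minimal sketch (workflow name and trigger are assumptions):

```yaml
# Hypothetical caller workflow (illustration only).
name: dotnet-build
on: [push]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4 # assumed: the action expects sources to be checked out
      - uses: ivuorinen/actions/csharp-build@main
```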
29
csharp-lint-check/README.md
Normal file
@@ -0,0 +1,29 @@
# ivuorinen/actions/csharp-lint-check

## C# Lint Check

### Description

Runs linters like StyleCop or dotnet-format for C# code style checks.

### Inputs

| name | description | required | default |
| ---------------- | ---------------------------------- | -------- | ------- |
| `dotnet-version` | <p>Version of .NET SDK to use.</p> | `false` | `""` |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/csharp-lint-check@main
  with:
    dotnet-version:
    # Version of .NET SDK to use.
    #
    # Required: false
    # Default: ""
```
45
csharp-lint-check/action.yml
Normal file
@@ -0,0 +1,45 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: 'C# Lint Check'
description: 'Runs linters like StyleCop or dotnet-format for C# code style checks.'
author: 'Ismo Vuorinen'

branding:
  icon: 'code'
  color: 'blue'

inputs:
  dotnet-version:
    description: 'Version of .NET SDK to use.'
    required: false

runs:
  using: composite
  steps:
    - name: Detect .NET SDK Version
      uses: ivuorinen/actions/dotnet-version-detect@main
      with:
        default-version: '7.0'

    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v3
      with:
        dotnet-version: '${{ steps.detect-dotnet-version.outputs.dotnet-version }}'

    - name: Install dotnet-format
      shell: bash
      run: dotnet tool install --global dotnet-format --version 7.0.1

    - name: Run dotnet-format
      shell: bash
      run: |
        set -eo pipefail
        if ! dotnet format --check --report sarif --report-file dotnet-format.sarif; then
          echo "::error::Code formatting issues found. Check the SARIF report for details."
          exit 1
        fi

    - name: Upload SARIF Report
      uses: github/codeql-action/upload-sarif@v2
      with:
        sarif_file: dotnet-format.sarif
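A minimal caller sketch for the lint check; the trigger and the `security-events: write` permission are assumptions based on the SARIF upload step above.

```yaml
# Hypothetical caller workflow (illustration only).
name: dotnet-lint
on: [pull_request]
permissions:
  security-events: write # assumed for the SARIF upload
jobs:
  lint:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/csharp-lint-check@main
```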
36
csharp-publish/README.md
Normal file
@@ -0,0 +1,36 @@
# ivuorinen/actions/csharp-publish

## C# Publish

### Description

Publishes a C# project to GitHub Packages.

### Inputs

| name | description | required | default |
| ---------------- | ---------------------------------------- | -------- | ----------- |
| `dotnet-version` | <p>Version of .NET SDK to use.</p> | `false` | `""` |
| `namespace` | <p>GitHub namespace for the package.</p> | `true` | `ivuorinen` |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/csharp-publish@main
  with:
    dotnet-version:
    # Version of .NET SDK to use.
    #
    # Required: false
    # Default: ""

    namespace:
    # GitHub namespace for the package.
    #
    # Required: true
    # Default: ivuorinen
```
56
csharp-publish/action.yml
Normal file
@@ -0,0 +1,56 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: C# Publish
description: 'Publishes a C# project to GitHub Packages.'
author: 'Ismo Vuorinen'

branding:
  icon: package
  color: blue

inputs:
  dotnet-version:
    description: 'Version of .NET SDK to use.'
    required: false
  namespace:
    description: 'GitHub namespace for the package.'
    required: true
    default: 'ivuorinen'

runs:
  using: composite
  steps:
    - name: Detect .NET SDK Version
      uses: ivuorinen/actions/dotnet-version-detect@main
      with:
        default-version: '7.0'

    - name: Setup .NET SDK
      uses: actions/setup-dotnet@v3
      with:
        dotnet-version: '${{ steps.detect-dotnet-version.outputs.dotnet-version }}'

    - name: Restore Dependencies
      shell: bash
      run: dotnet restore

    - name: Build Solution
      shell: bash
      run: dotnet build --configuration Release --no-restore

    - name: Pack Solution
      shell: bash
      run: dotnet pack --configuration Release --output ./artifacts

    - name: Publish Package
      shell: bash
      run: dotnet nuget push ./artifacts/*.nupkg \
        --api-key ${{ secrets.GITHUB_TOKEN }} \
        --source "https://nuget.pkg.github.com/${{ inputs.namespace }}/index.json" \
        --skip-duplicate \
        --no-symbols \
        || (sleep 5 && dotnet nuget push ./artifacts/*.nupkg \
        --api-key ${{ secrets.GITHUB_TOKEN }} \
        --source "https://nuget.pkg.github.com/${{ inputs.namespace }}/index.json" \
        --skip-duplicate \
        --no-symbols)
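A minimal caller sketch; the release trigger and `packages: write` permission are assumptions based on the nuget push step above, not part of this commit.

```yaml
# Hypothetical caller workflow (illustration only).
name: publish-nuget
on:
  release:
    types: [published]
permissions:
  contents: read
  packages: write # assumed: required to push to GitHub Packages
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/csharp-publish@main
        with:
          namespace: ivuorinen
```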
93
docker-build/README.md
Normal file
@@ -0,0 +1,93 @@
# ivuorinen/actions/docker-build

## Docker Build

### Description

Builds a Docker image for multiple architectures with enhanced security and reliability.

### Inputs

| name | description | required | default |
| --------------- | ----------------------------------------------------------------------------------- | -------- | --------------------------------------------------- |
| `image-name` | <p>The name of the Docker image to build. Defaults to the repository name.</p> | `false` | `""` |
| `tag` | <p>The tag for the Docker image. Must follow semver or valid Docker tag format.</p> | `true` | `""` |
| `architectures` | <p>Comma-separated list of architectures to build for.</p> | `false` | `linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6` |
| `dockerfile` | <p>Path to the Dockerfile</p> | `false` | `Dockerfile` |
| `context` | <p>Docker build context</p> | `false` | `.` |
| `build-args` | <p>Build arguments in format KEY=VALUE,KEY2=VALUE2</p> | `false` | `""` |
| `cache-from` | <p>External cache sources (e.g., type=registry,ref=user/app:cache)</p> | `false` | `""` |
| `push` | <p>Whether to push the image after building</p> | `false` | `true` |
| `max-retries` | <p>Maximum number of retry attempts for build and push operations</p> | `false` | `3` |

### Outputs

| name | description |
| -------------- | ------------------------------------ |
| `image-digest` | <p>The digest of the built image</p> |
| `metadata` | <p>Build metadata in JSON format</p> |
| `platforms` | <p>Successfully built platforms</p> |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/docker-build@main
  with:
    image-name:
    # The name of the Docker image to build. Defaults to the repository name.
    #
    # Required: false
    # Default: ""

    tag:
    # The tag for the Docker image. Must follow semver or valid Docker tag format.
    #
    # Required: true
    # Default: ""

    architectures:
    # Comma-separated list of architectures to build for.
    #
    # Required: false
    # Default: linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6

    dockerfile:
    # Path to the Dockerfile
    #
    # Required: false
    # Default: Dockerfile

    context:
    # Docker build context
    #
    # Required: false
    # Default: .

    build-args:
    # Build arguments in format KEY=VALUE,KEY2=VALUE2
    #
    # Required: false
    # Default: ""

    cache-from:
    # External cache sources (e.g., type=registry,ref=user/app:cache)
    #
    # Required: false
    # Default: ""

    push:
    # Whether to push the image after building
    #
    # Required: false
    # Default: true

    max-retries:
    # Maximum number of retry attempts for build and push operations
    #
    # Required: false
    # Default: 3
```
225
docker-build/action.yml
Normal file
@@ -0,0 +1,225 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Docker Build
description: 'Builds a Docker image for multiple architectures with enhanced security and reliability.'
author: 'Ismo Vuorinen'

branding:
  icon: 'package'
  color: 'blue'

inputs:
  image-name:
    description: 'The name of the Docker image to build. Defaults to the repository name.'
    required: false
  tag:
    description: 'The tag for the Docker image. Must follow semver or valid Docker tag format.'
    required: true
  architectures:
    description: 'Comma-separated list of architectures to build for.'
    required: false
    default: 'linux/amd64,linux/arm64,linux/arm/v7,linux/arm/v6'
  dockerfile:
    description: 'Path to the Dockerfile'
    required: false
    default: 'Dockerfile'
  context:
    description: 'Docker build context'
    required: false
    default: '.'
  build-args:
    description: 'Build arguments in format KEY=VALUE,KEY2=VALUE2'
    required: false
  cache-from:
    description: 'External cache sources (e.g., type=registry,ref=user/app:cache)'
    required: false
  push:
    description: 'Whether to push the image after building'
    required: false
    default: 'true'
  max-retries:
    description: 'Maximum number of retry attempts for build and push operations'
    required: false
    default: '3'

outputs:
  image-digest:
    description: 'The digest of the built image'
    value: ${{ steps.build.outputs.digest }}
  metadata:
    description: 'Build metadata in JSON format'
    value: ${{ steps.build.outputs.metadata }}
  platforms:
    description: 'Successfully built platforms'
    value: ${{ steps.platforms.outputs.built }}

runs:
  using: composite
  steps:
    - name: Validate Inputs
      id: validate
      shell: bash
      run: |
        set -euo pipefail

        # Validate image name
        if [ -n "${{ inputs.image-name }}" ]; then
          if ! [[ "${{ inputs.image-name }}" =~ ^[a-z0-9]+(?:[._-][a-z0-9]+)*$ ]]; then
            echo "::error::Invalid image name format. Must match ^[a-z0-9]+(?:[._-][a-z0-9]+)*$"
            exit 1
          fi
        fi

        # Validate tag
        if ! [[ "${{ inputs.tag }}" =~ ^(v?[0-9]+\.[0-9]+\.[0-9]+(-[\w.]+)?(\+[\w.]+)?|latest|[a-zA-Z][-a-zA-Z0-9._]{0,127})$ ]]; then
          echo "::error::Invalid tag format. Must be semver or valid Docker tag"
          exit 1
        fi

        # Validate architectures
        IFS=',' read -ra ARCHS <<< "${{ inputs.architectures }}"
        for arch in "${ARCHS[@]}"; do
          if ! [[ "$arch" =~ ^linux/(amd64|arm64|arm/v7|arm/v6|386|ppc64le|s390x)$ ]]; then
            echo "::error::Invalid architecture format: $arch"
            exit 1
          fi
        done

        # Validate Dockerfile existence
        if [ ! -f "${{ inputs.dockerfile }}" ]; then
          echo "::error::Dockerfile not found at ${{ inputs.dockerfile }}"
          exit 1
        fi

    - name: Set up QEMU
      uses: docker/setup-qemu-action@v3
      with:
        platforms: ${{ inputs.architectures }}

    - name: Set up Docker Buildx
      id: buildx
      uses: docker/setup-buildx-action@v3
      with:
        version: latest
        platforms: ${{ inputs.architectures }}

    - name: Determine Image Name
      id: image-name
      shell: bash
      run: |
        set -euo pipefail

        if [ -z "${{ inputs.image-name }}" ]; then
          repo_name=$(basename "${GITHUB_REPOSITORY}")
          echo "name=${repo_name}" >> $GITHUB_OUTPUT
        else
          echo "name=${{ inputs.image-name }}" >> $GITHUB_OUTPUT
        fi

    - name: Parse Build Arguments
      id: build-args
      shell: bash
      run: |
        set -euo pipefail

        args=""
        if [ -n "${{ inputs.build-args }}" ]; then
          IFS=',' read -ra BUILD_ARGS <<< "${{ inputs.build-args }}"
          for arg in "${BUILD_ARGS[@]}"; do
            args="$args --build-arg $arg"
          done
        fi
        echo "args=${args}" >> $GITHUB_OUTPUT

    - name: Set up Build Cache
      id: cache
      shell: bash
      run: |
        set -euo pipefail

        cache_from=""
        if [ -n "${{ inputs.cache-from }}" ]; then
          cache_from="--cache-from ${{ inputs.cache-from }}"
        fi

        # Local cache configuration
        cache_from="$cache_from --cache-from type=local,src=/tmp/.buildx-cache"
        cache_to="--cache-to type=local,dest=/tmp/.buildx-cache-new,mode=max"

        echo "from=${cache_from}" >> $GITHUB_OUTPUT
        echo "to=${cache_to}" >> $GITHUB_OUTPUT

    - name: Build Multi-Architecture Docker Image
      id: build
      shell: bash
      run: |
        set -euo pipefail

        attempt=1
        max_attempts=${{ inputs.max-retries }}

        while [ $attempt -le $max_attempts ]; do
          echo "Build attempt $attempt of $max_attempts"

          if docker buildx build \
            --platform=${{ inputs.architectures }} \
            --tag ${{ steps.image-name.outputs.name }}:${{ inputs.tag }} \
            ${{ steps.build-args.outputs.args }} \
            ${{ steps.cache.outputs.from }} \
            ${{ steps.cache.outputs.to }} \
            --file ${{ inputs.dockerfile }} \
            ${{ inputs.push == 'true' && '--push' || '--load' }} \
            --provenance=true \
            --sbom=true \
            ${{ inputs.context }}; then

            # Get image digest
            digest=$(docker buildx imagetools inspect ${{ steps.image-name.outputs.name }}:${{ inputs.tag }} --raw)
            echo "digest=${digest}" >> $GITHUB_OUTPUT

            # Move cache
            rm -rf /tmp/.buildx-cache
            mv /tmp/.buildx-cache-new /tmp/.buildx-cache

            break
          fi

          attempt=$((attempt + 1))
          if [ $attempt -le $max_attempts ]; then
            echo "Build failed, waiting 10 seconds before retry..."
            sleep 10
          else
            echo "::error::Build failed after $max_attempts attempts"
            exit 1
          fi
        done

    - name: Verify Build
      id: verify
      shell: bash
      run: |
        set -euo pipefail

        # Verify image exists
        if ! docker buildx imagetools inspect ${{ steps.image-name.outputs.name }}:${{ inputs.tag }} >/dev/null 2>&1; then
          echo "::error::Built image not found"
          exit 1
        fi

        # Get and verify platform support
        platforms=$(docker buildx imagetools inspect ${{ steps.image-name.outputs.name }}:${{ inputs.tag }} | grep "Platform:" | cut -d' ' -f2)
        echo "built=${platforms}" >> $GITHUB_OUTPUT

    - name: Cleanup
      if: always()
      shell: bash
      run: |
        set -euo pipefail

        # Cleanup temporary files
        rm -rf /tmp/.buildx-cache*

        # Remove builder instance if created
        if docker buildx ls | grep -q builder; then
          docker buildx rm builder || true
        fi
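The build step retries up to `max-retries` times and exposes the image digest as an output. A minimal caller sketch follows; the tag, build arguments, and platform list are illustrative assumptions.

```yaml
# Hypothetical caller step (illustration only).
- uses: ivuorinen/actions/docker-build@main
  id: build
  with:
    tag: v1.2.3
    architectures: linux/amd64,linux/arm64
    build-args: VERSION=1.2.3,GIT_SHA=${{ github.sha }}
    push: 'false' # load the image locally instead of pushing
- run: echo "Built digest ${{ steps.build.outputs.image-digest }}"
  shell: bash
```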
95
docker-publish-gh/README.md
Normal file
@@ -0,0 +1,95 @@
|
||||
# ivuorinen/actions/docker-publish-gh
|
||||
|
||||
## Docker Publish to GitHub Packages
|
||||
|
||||
### Description
|
||||
|
||||
Publishes a Docker image to GitHub Packages with advanced security and reliability features.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------- | -------------------------------------------------------------------------------- | -------- | ------------------------- |
|
||||
| `image-name` | <p>The name of the Docker image to publish. Defaults to the repository name.</p> | `false` | `""` |
|
||||
| `tags` | <p>Comma-separated list of tags for the Docker image.</p> | `true` | `""` |
|
||||
| `platforms` | <p>Platforms to publish (comma-separated). Defaults to amd64 and arm64.</p> | `false` | `linux/amd64,linux/arm64` |
|
||||
| `registry` | <p>GitHub Container Registry URL</p> | `false` | `ghcr.io` |
|
||||
| `token` | <p>GitHub token with package write permissions</p> | `false` | `${{ github.token }}` |
|
||||
| `provenance` | <p>Enable SLSA provenance generation</p> | `false` | `true` |
|
||||
| `sbom` | <p>Generate Software Bill of Materials</p> | `false` | `true` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts for publishing</p> | `false` | `3` |
|
||||
| `retry-delay` | <p>Delay in seconds between retries</p> | `false` | `10` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------ | ----------------------------------------- |
|
||||
| `image-name` | <p>Full image name including registry</p> |
|
||||
| `digest` | <p>The digest of the published image</p> |
|
||||
| `tags` | <p>List of published tags</p> |
|
||||
| `provenance` | <p>SLSA provenance attestation</p> |
|
||||
| `sbom` | <p>SBOM document location</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/docker-publish-gh@main
|
||||
with:
|
||||
image-name:
|
||||
# The name of the Docker image to publish. Defaults to the repository name.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
tags:
|
||||
# Comma-separated list of tags for the Docker image.
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
platforms:
|
||||
# Platforms to publish (comma-separated). Defaults to amd64 and arm64.
|
||||
#
|
||||
# Required: false
|
||||
# Default: linux/amd64,linux/arm64
|
||||
|
||||
registry:
|
||||
# GitHub Container Registry URL
|
||||
#
|
||||
# Required: false
|
||||
# Default: ghcr.io
|
||||
|
||||
token:
|
||||
# GitHub token with package write permissions
|
||||
#
|
||||
# Required: false
|
||||
# Default: ${{ github.token }}
|
||||
|
||||
provenance:
|
||||
# Enable SLSA provenance generation
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
sbom:
|
||||
# Generate Software Bill of Materials
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts for publishing
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
|
||||
retry-delay:
|
||||
# Delay in seconds between retries
|
||||
#
|
||||
# Required: false
|
||||
# Default: 10
|
||||
```
|
||||
234
docker-publish-gh/action.yml
Normal file
@@ -0,0 +1,234 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Docker Publish to GitHub Packages
|
||||
description: 'Publishes a Docker image to GitHub Packages with advanced security and reliability features.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'package'
|
||||
color: 'blue'
|
||||
|
||||
inputs:
|
||||
image-name:
|
||||
description: 'The name of the Docker image to publish. Defaults to the repository name.'
|
||||
required: false
|
||||
tags:
|
||||
description: 'Comma-separated list of tags for the Docker image.'
|
||||
required: true
|
||||
platforms:
|
||||
description: 'Platforms to publish (comma-separated). Defaults to amd64 and arm64.'
|
||||
required: false
|
||||
default: 'linux/amd64,linux/arm64'
|
||||
registry:
|
||||
description: 'GitHub Container Registry URL'
|
||||
required: false
|
||||
default: 'ghcr.io'
|
||||
token:
|
||||
description: 'GitHub token with package write permissions'
|
||||
required: false
|
||||
default: ${{ github.token }}
|
||||
provenance:
|
||||
description: 'Enable SLSA provenance generation'
|
||||
required: false
|
||||
default: 'true'
|
||||
sbom:
|
||||
description: 'Generate Software Bill of Materials'
|
||||
required: false
|
||||
default: 'true'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts for publishing'
|
||||
required: false
|
||||
default: '3'
|
||||
retry-delay:
|
||||
description: 'Delay in seconds between retries'
|
||||
required: false
|
||||
default: '10'
|
||||
|
||||
outputs:
|
||||
image-name:
|
||||
description: 'Full image name including registry'
|
||||
value: ${{ steps.metadata.outputs.full-name }}
|
||||
digest:
|
||||
description: 'The digest of the published image'
|
||||
value: ${{ steps.publish.outputs.digest }}
|
||||
tags:
|
||||
description: 'List of published tags'
|
||||
value: ${{ steps.metadata.outputs.tags }}
|
||||
provenance:
|
||||
description: 'SLSA provenance attestation'
|
||||
value: ${{ steps.publish.outputs.provenance }}
|
||||
sbom:
|
||||
description: 'SBOM document location'
|
||||
value: ${{ steps.publish.outputs.sbom }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate image name format
|
||||
if [ -n "${{ inputs.image-name }}" ]; then
|
||||
if ! [[ "${{ inputs.image-name }}" =~ ^[a-z0-9]+(?:[._-][a-z0-9]+)*$ ]]; then
|
||||
echo "::error::Invalid image name format"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Validate tags
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
if ! [[ "$tag" =~ ^(v?[0-9]+\.[0-9]+\.[0-9]+(-[\w.]+)?(\+[\w.]+)?|latest|[a-zA-Z][-a-zA-Z0-9._]{0,127})$ ]]; then
|
||||
echo "::error::Invalid tag format: $tag"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Validate platforms
|
||||
IFS=',' read -ra PLATFORMS <<< "${{ inputs.platforms }}"
|
||||
for platform in "${PLATFORMS[@]}"; do
|
||||
if ! [[ "$platform" =~ ^linux/(amd64|arm64|arm/v7|arm/v6|386|ppc64le|s390x)$ ]]; then
|
||||
echo "::error::Invalid platform: $platform"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v3
|
||||
with:
|
||||
platforms: ${{ inputs.platforms }}
|
||||
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
with:
|
||||
platforms: ${{ inputs.platforms }}
|
||||
|
||||
- name: Prepare Metadata
|
||||
id: metadata
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Determine image name
|
||||
if [ -z "${{ inputs.image-name }}" ]; then
|
||||
image_name=$(basename $GITHUB_REPOSITORY)
|
||||
else
|
||||
image_name="${{ inputs.image-name }}"
|
||||
fi
|
||||
|
||||
# Construct full image name with registry
|
||||
full_name="${{ inputs.registry }}/${{ github.repository_owner }}/${image_name}"
|
||||
echo "full-name=${full_name}" >> $GITHUB_OUTPUT
|
||||
|
||||
# Process tags
|
||||
processed_tags=""
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
processed_tags="${processed_tags}${full_name}:${tag},"
|
||||
done
|
||||
processed_tags=${processed_tags%,}
|
||||
echo "tags=${processed_tags}" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Log in to GitHub Container Registry
|
||||
uses: docker/login-action@v3
|
||||
with:
|
||||
registry: ${{ inputs.registry }}
|
||||
username: ${{ github.actor }}
|
||||
password: ${{ inputs.token }}
|
||||
|
||||
- name: Set up Cosign
|
||||
if: inputs.provenance == 'true'
|
||||
uses: sigstore/cosign-installer@v3
|
||||
|
||||
- name: Publish Image
|
||||
id: publish
|
||||
shell: bash
|
||||
env:
|
||||
DOCKER_BUILDKIT: 1
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
attempt=1
|
||||
max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Publishing attempt $attempt of $max_attempts"
|
||||
|
||||
if docker buildx build \
|
||||
--platform=${{ inputs.platforms }} \
|
||||
--tag ${{ steps.metadata.outputs.tags }} \
|
||||
--push \
|
||||
${{ inputs.provenance == 'true' && '--provenance=true' || '' }} \
|
||||
${{ inputs.sbom == 'true' && '--sbom=true' || '' }} \
|
||||
--label "org.opencontainers.image.source=${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}" \
|
||||
--label "org.opencontainers.image.created=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" \
|
||||
--label "org.opencontainers.image.revision=${GITHUB_SHA}" \
|
||||
.; then
|
||||
|
||||
# Get image digest
|
||||
digest=$(docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${TAGS[0]} --raw)
|
||||
echo "digest=${digest}" >> $GITHUB_OUTPUT
|
||||
|
||||
# Generate attestations if enabled
|
||||
if [[ "${{ inputs.provenance }}" == "true" ]]; then
|
||||
cosign verify-attestation \
|
||||
--type slsaprovenance \
|
||||
${{ steps.metadata.outputs.full-name }}@${digest}
|
||||
echo "provenance=true" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
|
||||
if [[ "${{ inputs.sbom }}" == "true" ]]; then
|
||||
sbom_path="ghcr.io/${{ github.repository_owner }}/${image_name}.sbom"
|
||||
echo "sbom=${sbom_path}" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
|
||||
break
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Publish failed, waiting ${{ inputs.retry-delay }} seconds before retry..."
|
||||
sleep ${{ inputs.retry-delay }}
|
||||
else
|
||||
echo "::error::Publishing failed after $max_attempts attempts"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Verify Publication
|
||||
id: verify
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Verify image existence and accessibility
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
if ! docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${tag} >/dev/null 2>&1; then
|
||||
echo "::error::Published image not found: $tag"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Verify platforms
|
||||
IFS=',' read -ra PLATFORMS <<< "${{ inputs.platforms }}"
|
||||
for platform in "${PLATFORMS[@]}"; do
|
||||
if ! docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${TAGS[0]} | grep -q "$platform"; then
|
||||
echo "::warning::Platform $platform not found in published image"
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Clean up
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Remove temporary files and cleanup Docker cache
|
||||
docker buildx prune -f --keep-storage=10GB
|
||||
|
||||
# Logout from registry
|
||||
docker logout ${{ inputs.registry }}
|
||||
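A minimal caller sketch for publishing to GitHub Packages; the release trigger, permissions, and tag expression are assumptions, not part of this commit.

```yaml
# Hypothetical caller workflow (illustration only).
name: publish-ghcr
on:
  release:
    types: [published]
permissions:
  contents: read
  packages: write # assumed: required to push to ghcr.io
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/docker-publish-gh@main
        with:
          tags: ${{ github.event.release.tag_name }},latest
```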
108
docker-publish-hub/README.md
Normal file
@@ -0,0 +1,108 @@
|
||||
# ivuorinen/actions/docker-publish-hub
|
||||
|
||||
## Docker Publish to Docker Hub
|
||||
|
||||
### Description
|
||||
|
||||
Publishes a Docker image to Docker Hub with enhanced security and reliability features.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------------ | -------------------------------------------------------------------------------- | -------- | ------------------------- |
|
||||
| `image-name` | <p>The name of the Docker image to publish. Defaults to the repository name.</p> | `false` | `""` |
|
||||
| `tags` | <p>Comma-separated list of tags for the Docker image.</p> | `true` | `""` |
|
||||
| `platforms` | <p>Platforms to publish (comma-separated). Defaults to amd64 and arm64.</p> | `false` | `linux/amd64,linux/arm64` |
|
||||
| `username` | <p>Docker Hub username</p> | `true` | `""` |
|
||||
| `password` | <p>Docker Hub password or access token</p> | `true` | `""` |
|
||||
| `repository-description` | <p>Update Docker Hub repository description</p> | `false` | `""` |
|
||||
| `readme-file` | <p>Path to README file to update on Docker Hub</p> | `false` | `README.md` |
|
||||
| `provenance` | <p>Enable SLSA provenance generation</p> | `false` | `true` |
|
||||
| `sbom` | <p>Generate Software Bill of Materials</p> | `false` | `true` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts for publishing</p> | `false` | `3` |
|
||||
| `retry-delay` | <p>Delay in seconds between retries</p> | `false` | `10` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------ | ----------------------------------------- |
|
||||
| `image-name` | <p>Full image name including registry</p> |
|
||||
| `digest` | <p>The digest of the published image</p> |
|
||||
| `tags` | <p>List of published tags</p> |
|
||||
| `repo-url` | <p>Docker Hub repository URL</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/docker-publish-hub@main
|
||||
with:
|
||||
image-name:
|
||||
# The name of the Docker image to publish. Defaults to the repository name.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
tags:
|
||||
# Comma-separated list of tags for the Docker image.
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
platforms:
|
||||
# Platforms to publish (comma-separated). Defaults to amd64 and arm64.
|
||||
#
|
||||
# Required: false
|
||||
# Default: linux/amd64,linux/arm64
|
||||
|
||||
username:
|
||||
# Docker Hub username
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
password:
|
||||
# Docker Hub password or access token
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
repository-description:
|
||||
# Update Docker Hub repository description
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
readme-file:
|
||||
# Path to README file to update on Docker Hub
|
||||
#
|
||||
# Required: false
|
||||
# Default: README.md
|
||||
|
||||
provenance:
|
||||
# Enable SLSA provenance generation
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
sbom:
|
||||
# Generate Software Bill of Materials
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts for publishing
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
|
||||
retry-delay:
|
||||
# Delay in seconds between retries
|
||||
#
|
||||
# Required: false
|
||||
# Default: 10
|
||||
```
|
||||
258
docker-publish-hub/action.yml
Normal file
@@ -0,0 +1,258 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Docker Publish to Docker Hub
|
||||
description: 'Publishes a Docker image to Docker Hub with enhanced security and reliability features.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'package'
|
||||
color: 'blue'
|
||||
|
||||
inputs:
|
||||
image-name:
|
||||
description: 'The name of the Docker image to publish. Defaults to the repository name.'
|
||||
required: false
|
||||
tags:
|
||||
description: 'Comma-separated list of tags for the Docker image.'
|
||||
required: true
|
||||
platforms:
|
||||
description: 'Platforms to publish (comma-separated). Defaults to amd64 and arm64.'
|
||||
required: false
|
||||
default: 'linux/amd64,linux/arm64'
|
||||
username:
|
||||
description: 'Docker Hub username'
|
||||
required: true
|
||||
password:
|
||||
description: 'Docker Hub password or access token'
|
||||
required: true
|
||||
repository-description:
|
||||
description: 'Update Docker Hub repository description'
|
||||
required: false
|
||||
readme-file:
|
||||
description: 'Path to README file to update on Docker Hub'
|
||||
required: false
|
||||
default: 'README.md'
|
||||
provenance:
|
||||
description: 'Enable SLSA provenance generation'
|
||||
required: false
|
||||
default: 'true'
|
||||
sbom:
|
||||
description: 'Generate Software Bill of Materials'
|
||||
required: false
|
||||
default: 'true'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts for publishing'
|
||||
required: false
|
||||
default: '3'
|
||||
retry-delay:
|
||||
description: 'Delay in seconds between retries'
|
||||
required: false
|
||||
default: '10'
|
||||
|
||||
outputs:
|
||||
image-name:
|
||||
description: 'Full image name including registry'
|
||||
value: ${{ steps.metadata.outputs.full-name }}
|
||||
digest:
|
||||
description: 'The digest of the published image'
|
||||
value: ${{ steps.publish.outputs.digest }}
|
||||
tags:
|
||||
description: 'List of published tags'
|
||||
value: ${{ steps.metadata.outputs.tags }}
|
||||
repo-url:
|
||||
description: 'Docker Hub repository URL'
|
||||
value: ${{ steps.metadata.outputs.repo-url }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate image name format
|
||||
if [ -n "${{ inputs.image-name }}" ]; then
|
||||
if ! [[ "${{ inputs.image-name }}" =~ ^[a-z0-9]+(?:[._-][a-z0-9]+)*$ ]]; then
|
||||
echo "::error::Invalid image name format"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Validate tags
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
if ! [[ "$tag" =~ ^(v?[0-9]+\.[0-9]+\.[0-9]+(-[\w.]+)?(\+[\w.]+)?|latest|[a-zA-Z][-a-zA-Z0-9._]{0,127})$ ]]; then
|
||||
echo "::error::Invalid tag format: $tag"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Validate platforms
|
||||
IFS=',' read -ra PLATFORMS <<< "${{ inputs.platforms }}"
|
||||
for platform in "${PLATFORMS[@]}"; do
|
||||
if ! [[ "$platform" =~ ^linux/(amd64|arm64|arm/v7|arm/v6|386|ppc64le|s390x)$ ]]; then
|
||||
echo "::error::Invalid platform: $platform"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Validate credentials (without exposing them)
|
||||
if [ -z "${{ inputs.username }}" ] || [ -z "${{ inputs.password }}" ]; then
|
||||
echo "::error::Docker Hub credentials are required"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Set up QEMU
|
||||
uses: docker/setup-qemu-action@v3
|
||||
with:
|
||||
platforms: ${{ inputs.platforms }}
|
||||
|
||||
- name: Set up Docker Buildx
|
||||
uses: docker/setup-buildx-action@v3
|
||||
with:
|
||||
platforms: ${{ inputs.platforms }}
|
||||
|
||||
- name: Prepare Metadata
|
||||
id: metadata
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Determine image name
|
||||
if [ -z "${{ inputs.image-name }}" ]; then
|
||||
image_name=$(basename $GITHUB_REPOSITORY)
|
||||
else
|
||||
image_name="${{ inputs.image-name }}"
|
||||
fi
|
||||
|
||||
# Construct full image name
|
||||
full_name="${{ inputs.username }}/${image_name}"
|
||||
echo "full-name=${full_name}" >> $GITHUB_OUTPUT
|
||||
|
||||
# Process tags
|
||||
processed_tags=""
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
processed_tags="${processed_tags}${full_name}:${tag},"
|
||||
done
|
||||
processed_tags=${processed_tags%,}
|
||||
echo "tags=${processed_tags}" >> $GITHUB_OUTPUT
|
||||
|
||||
# Generate repository URL
|
||||
echo "repo-url=https://hub.docker.com/r/${full_name}" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Log in to Docker Hub
|
||||
uses: docker/login-action@v3
|
||||
with:
|
||||
username: ${{ inputs.username }}
|
||||
password: ${{ inputs.password }}
|
||||
|
||||
- name: Set up Cosign
|
||||
if: inputs.provenance == 'true'
|
||||
uses: sigstore/cosign-installer@v3
|
||||
|
||||
- name: Update Docker Hub Description
|
||||
if: inputs.repository-description != '' || inputs.readme-file != ''
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Install Docker Hub API client
|
||||
pip install docker-hub-api
|
||||
|
||||
# Update repository description
|
||||
if [ -n "${{ inputs.repository-description }}" ]; then
|
||||
docker-hub-api update-repo \
|
||||
--user "${{ inputs.username }}" \
|
||||
--password "${{ inputs.password }}" \
|
||||
--name "${{ steps.metadata.outputs.full-name }}" \
|
||||
--description "${{ inputs.repository-description }}"
|
||||
fi
|
||||
|
||||
# Update README
|
||||
if [ -f "${{ inputs.readme-file }}" ]; then
|
||||
docker-hub-api update-repo \
|
||||
--user "${{ inputs.username }}" \
|
||||
--password "${{ inputs.password }}" \
|
||||
--name "${{ steps.metadata.outputs.full-name }}" \
|
||||
--full-description "$(cat ${{ inputs.readme-file }})"
|
||||
fi
|
||||
|
||||
- name: Publish Image
|
||||
id: publish
|
||||
shell: bash
|
||||
env:
|
||||
DOCKER_BUILDKIT: 1
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
attempt=1
|
||||
max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Publishing attempt $attempt of $max_attempts"
|
||||
|
||||
if docker buildx build \
|
||||
--platform=${{ inputs.platforms }} \
|
||||
--tag ${{ steps.metadata.outputs.tags }} \
|
||||
--push \
|
||||
${{ inputs.provenance == 'true' && '--provenance=true' || '' }} \
|
||||
${{ inputs.sbom == 'true' && '--sbom=true' || '' }} \
|
||||
--label "org.opencontainers.image.source=${GITHUB_SERVER_URL}/${GITHUB_REPOSITORY}" \
|
||||
--label "org.opencontainers.image.created=$(date -u +'%Y-%m-%dT%H:%M:%SZ')" \
|
||||
--label "org.opencontainers.image.revision=${GITHUB_SHA}" \
|
||||
.; then
|
||||
|
||||
# Get image digest
|
||||
digest=$(docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${TAGS[0]} --raw)
|
||||
echo "digest=${digest}" >> $GITHUB_OUTPUT
|
||||
|
||||
break
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Publish failed, waiting ${{ inputs.retry-delay }} seconds before retry..."
|
||||
sleep ${{ inputs.retry-delay }}
|
||||
else
|
||||
echo "::error::Publishing failed after $max_attempts attempts"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Verify Publication
|
||||
id: verify
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Verify image existence and accessibility
|
||||
IFS=',' read -ra TAGS <<< "${{ inputs.tags }}"
|
||||
for tag in "${TAGS[@]}"; do
|
||||
if ! docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${tag} >/dev/null 2>&1; then
|
||||
echo "::error::Published image not found: $tag"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
# Verify platforms
|
||||
IFS=',' read -ra PLATFORMS <<< "${{ inputs.platforms }}"
|
||||
for platform in "${PLATFORMS[@]}"; do
|
||||
if ! docker buildx imagetools inspect ${{ steps.metadata.outputs.full-name }}:${TAGS[0]} | grep -q "$platform"; then
|
||||
echo "::warning::Platform $platform not found in published image"
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Clean up
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Remove temporary files and cleanup Docker cache
|
||||
docker buildx prune -f --keep-storage=10GB
|
||||
|
||||
# Logout from Docker Hub
|
||||
docker logout
|
||||
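A minimal caller sketch for the Docker Hub variant; the secret names used for the credentials are assumptions.

```yaml
# Hypothetical caller step (illustration only); secret names are assumptions.
- uses: actions/checkout@v4
- uses: ivuorinen/actions/docker-publish-hub@main
  with:
    tags: v1.2.3,latest
    username: ${{ secrets.DOCKERHUB_USERNAME }}
    password: ${{ secrets.DOCKERHUB_TOKEN }}
```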
42
docker-publish/README.md
Normal file
@@ -0,0 +1,42 @@
|
||||
# ivuorinen/actions/docker-publish
|
||||
|
||||
## Docker Publish
|
||||
|
||||
### Description
|
||||
|
||||
Publish a Docker image to GitHub Packages and Docker Hub.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ---------- | ----------------------------------------------------------- | -------- | ------- |
|
||||
| `registry` | <p>Registry to publish to (dockerhub, github, or both).</p> | `true` | `both` |
|
||||
| `nightly` | <p>Is this a nightly build? (true or false)</p> | `false` | `false` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ---------- | ----------------------------------------- |
|
||||
| `registry` | <p>Registry where image was published</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/docker-publish@main
|
||||
with:
|
||||
registry:
|
||||
# Registry to publish to (dockerhub, github, or both).
|
||||
#
|
||||
# Required: true
|
||||
# Default: both
|
||||
|
||||
nightly:
|
||||
# Is this a nightly build? (true or false)
|
||||
#
|
||||
# Required: false
|
||||
# Default: false
|
||||
```
|
||||
158
docker-publish/action.yml
Normal file
@@ -0,0 +1,158 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Docker Publish
|
||||
description: Publish a Docker image to GitHub Packages and Docker Hub.
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: upload-cloud
|
||||
color: blue
|
||||
|
||||
inputs:
|
||||
registry:
|
||||
description: 'Registry to publish to (dockerhub, github, or both).'
|
||||
required: true
|
||||
default: 'both'
|
||||
nightly:
|
||||
description: 'Is this a nightly build? (true or false)'
|
||||
required: false
|
||||
default: 'false'
|
||||
|
||||
outputs:
|
||||
registry:
|
||||
description: 'Registry where image was published'
|
||||
value: ${{ steps.dest.outputs.reg }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate registry input
|
||||
if ! [[ "${{ inputs.registry }}" =~ ^(dockerhub|github|both)$ ]]; then
|
||||
echo "::error::Invalid registry value. Must be 'dockerhub', 'github', or 'both'"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Determine Tags
|
||||
id: tags
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Initialize variables
|
||||
declare -a tag_array
|
||||
|
||||
if [[ "${{ inputs.nightly }}" == "true" ]]; then
|
||||
# Nightly build tags
|
||||
current_date=$(date +'%Y%m%d-%H%M')
|
||||
tag_array+=("nightly")
|
||||
tag_array+=("nightly-${current_date}")
|
||||
else
|
||||
# Release tags
|
||||
if [[ -n "${{ github.event.release.tag_name }}" ]]; then
|
||||
tag_array+=("${{ github.event.release.tag_name }}")
|
||||
tag_array+=("latest")
|
||||
else
|
||||
echo "::error::No release tag found and not a nightly build"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
# Join tags with comma
|
||||
tags=$(IFS=,; echo "${tag_array[*]}")
|
||||
echo "all-tags=${tags}" >> "$GITHUB_OUTPUT"
|
||||
echo "Generated tags: ${tags}"
|
||||
|
||||
- name: Determine Publish Destination
|
||||
id: dest
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
if [[ "${{ inputs.registry }}" == "both" ]]; then
|
||||
echo "reg=github,dockerhub" >> "$GITHUB_OUTPUT"
|
||||
else
|
||||
echo "reg=${{ inputs.registry }}" >> "$GITHUB_OUTPUT"
|
||||
fi
|
||||
|
||||
echo "Publishing to: ${{ inputs.registry }}"
|
||||
|
||||
- name: Build Multi-Arch Docker Image
|
||||
uses: ivuorinen/actions/docker-build@main
|
||||
with:
|
||||
tag: ${{ steps.tags.outputs.all-tags }}
|
||||
|
||||
- name: Publish to Docker Hub
|
||||
if: contains(steps.dest.outputs.reg, 'dockerhub')
|
||||
uses: ivuorinen/actions/docker-publish-hub@main
|
||||
with:
|
||||
tags: ${{ steps.tags.outputs.all-tags }}
|
||||
|
||||
- name: Publish to GitHub Packages
|
||||
if: contains(steps.dest.outputs.reg, 'github')
|
||||
uses: ivuorinen/actions/docker-publish-gh@main
|
||||
with:
|
||||
tags: ${{ steps.tags.outputs.all-tags }}
|
||||
|
||||
- name: Verify Publications
|
||||
id: verify
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
echo "Verifying publications..."
|
||||
success=true

# Tags are comma-separated; verify using the first generated tag
first_tag="${{ steps.tags.outputs.all-tags }}"
first_tag="${first_tag%%,*}"

# Split registry string into array
IFS=',' read -ra REGISTRIES <<< "${{ steps.dest.outputs.reg }}"

for registry in "${REGISTRIES[@]}"; do
echo "Checking ${registry} publication..."
case "${registry}" in
"dockerhub")
if ! curl -s "https://hub.docker.com/v2/repositories/${{ github.repository }}/tags/" | grep -q "${first_tag}"; then
echo "::error::Failed to verify Docker Hub publication"
success=false
fi
;;
"github")
if ! gh api "/packages/container/${GITHUB_REPOSITORY}/versions" | grep -q "${first_tag}"; then
echo "::error::Failed to verify GitHub Packages publication"
success=false
fi
;;
esac
done
|
||||
|
||||
if [[ "${success}" != "true" ]]; then
|
||||
echo "::error::Publication verification failed"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
echo "All publications verified successfully"
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
echo "Cleaning up..."
|
||||
|
||||
# Remove any temporary files or caches
|
||||
docker buildx prune -f --keep-storage=10GB
|
||||
|
||||
# Remove any temporary authentication
|
||||
if [[ "${{ steps.dest.outputs.reg }}" =~ "dockerhub" ]]; then
|
||||
docker logout docker.io || true
|
||||
fi
|
||||
if [[ "${{ steps.dest.outputs.reg }}" =~ "github" ]]; then
|
||||
docker logout ghcr.io || true
|
||||
fi
|
||||
|
||||
echo "Cleanup completed"
|
||||
35
dotnet-version-detect/README.md
Normal file
@@ -0,0 +1,35 @@
|
||||
# ivuorinen/actions/dotnet-version-detect
|
||||
|
||||
## Dotnet Version Detect
|
||||
|
||||
### Description
|
||||
|
||||
Detects .NET SDK version from global.json or defaults to a specified version.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------------- | ------------------------------------------------------------------- | -------- | ------- |
|
||||
| `default-version` | <p>Default .NET SDK version to use if global.json is not found.</p> | `true` | `7.0` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ---------------- | -------------------------------------------- |
|
||||
| `dotnet-version` | <p>Detected or default .NET SDK version.</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/dotnet-version-detect@main
|
||||
with:
|
||||
default-version:
|
||||
# Default .NET SDK version to use if global.json is not found.
|
||||
#
|
||||
# Required: true
|
||||
# Default: 7.0
|
||||
```
|
||||
36
dotnet-version-detect/action.yml
Normal file
@@ -0,0 +1,36 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Dotnet Version Detect
|
||||
description: 'Detects .NET SDK version from global.json or defaults to a specified version.'
|
||||
|
||||
inputs:
|
||||
default-version:
|
||||
description: 'Default .NET SDK version to use if global.json is not found.'
|
||||
required: true
|
||||
default: '7.0'
|
||||
|
||||
outputs:
|
||||
dotnet-version:
|
||||
description: 'Detected or default .NET SDK version.'
|
||||
value: ${{ steps.detect-dotnet-version.outputs.dotnet-version }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Detect .NET SDK Version
|
||||
id: detect-dotnet-version
|
||||
shell: bash
|
||||
run: |
|
||||
if [ -f global.json ]; then
|
||||
version=$(jq -r '.sdk.version' global.json)
|
||||
if [ "$version" != "null" ]; then
|
||||
echo "Detected .NET SDK version: $version"
|
||||
echo "dotnet-version=$version" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "No version specified in global.json. Using default."
|
||||
echo "dotnet-version=${{ inputs.default-version }}" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
else
|
||||
echo "global.json not found. Using default."
|
||||
echo "dotnet-version=${{ inputs.default-version }}" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
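A minimal sketch of how the detected version could feed `actions/setup-dotnet` in a calling workflow; the step id and the use of `actions/setup-dotnet@v4` are assumptions, not requirements of this action.

```yaml
steps:
  - uses: actions/checkout@v4
  - name: Detect .NET version
    id: dotnet
    uses: ivuorinen/actions/dotnet-version-detect@main
    with:
      default-version: '8.0' # assumed fallback
  - uses: actions/setup-dotnet@v4
    with:
      dotnet-version: ${{ steps.dotnet.outputs.dotnet-version }}
```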
101
eslint-check/README.md
Normal file
@@ -0,0 +1,101 @@
|
||||
# ivuorinen/actions/eslint-check
|
||||
|
||||
## ESLint Check
|
||||
|
||||
### Description
|
||||
|
||||
Run ESLint check on the repository with advanced configuration and reporting
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------- | ------------------------------------------------ | -------- | ------------------- |
|
||||
| `working-directory` | <p>Directory containing files to lint</p> | `false` | `.` |
|
||||
| `eslint-version` | <p>ESLint version to use</p> | `false` | `latest` |
|
||||
| `config-file` | <p>Path to ESLint config file</p> | `false` | `.eslintrc` |
|
||||
| `ignore-file` | <p>Path to ESLint ignore file</p> | `false` | `.eslintignore` |
|
||||
| `file-extensions` | <p>File extensions to lint (comma-separated)</p> | `false` | `.js,.jsx,.ts,.tsx` |
|
||||
| `cache` | <p>Enable ESLint caching</p> | `false` | `true` |
|
||||
| `max-warnings` | <p>Maximum number of warnings allowed</p> | `false` | `0` |
|
||||
| `fail-on-error` | <p>Fail workflow if issues are found</p> | `false` | `true` |
|
||||
| `report-format` | <p>Output format (stylish, json, sarif)</p> | `false` | `sarif` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts</p> | `false` | `3` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| --------------- | -------------------------------- |
|
||||
| `error-count` | <p>Number of errors found</p> |
|
||||
| `warning-count` | <p>Number of warnings found</p> |
|
||||
| `sarif-file` | <p>Path to SARIF report file</p> |
|
||||
| `files-checked` | <p>Number of files checked</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/eslint-check@main
|
||||
with:
|
||||
working-directory:
|
||||
# Directory containing files to lint
|
||||
#
|
||||
# Required: false
|
||||
# Default: .
|
||||
|
||||
eslint-version:
|
||||
# ESLint version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: latest
|
||||
|
||||
config-file:
|
||||
# Path to ESLint config file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .eslintrc
|
||||
|
||||
ignore-file:
|
||||
# Path to ESLint ignore file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .eslintignore
|
||||
|
||||
file-extensions:
|
||||
# File extensions to lint (comma-separated)
|
||||
#
|
||||
# Required: false
|
||||
# Default: .js,.jsx,.ts,.tsx
|
||||
|
||||
cache:
|
||||
# Enable ESLint caching
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
max-warnings:
|
||||
# Maximum number of warnings allowed
|
||||
#
|
||||
# Required: false
|
||||
# Default: 0
|
||||
|
||||
fail-on-error:
|
||||
# Fail workflow if issues are found
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
report-format:
|
||||
# Output format (stylish, json, sarif)
|
||||
#
|
||||
# Required: false
|
||||
# Default: sarif
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
```
|
||||
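Because the action uploads SARIF results itself when `report-format` is `sarif`, the calling job needs `security-events: write`. The following is a sketch under that assumption; the trigger and warning budget are illustrative only.

```yaml
name: Lint
on: [push, pull_request]

jobs:
  eslint:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      security-events: write # assumed to be required for the SARIF upload
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/eslint-check@main
        with:
          report-format: sarif
          max-warnings: '10'
```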
261
eslint-check/action.yml
Normal file
@@ -0,0 +1,261 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: ESLint Check
|
||||
description: 'Run ESLint check on the repository with advanced configuration and reporting'
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: check-circle
|
||||
color: blue
|
||||
|
||||
inputs:
|
||||
working-directory:
|
||||
description: 'Directory containing files to lint'
|
||||
required: false
|
||||
default: '.'
|
||||
eslint-version:
|
||||
description: 'ESLint version to use'
|
||||
required: false
|
||||
default: 'latest'
|
||||
config-file:
|
||||
description: 'Path to ESLint config file'
|
||||
required: false
|
||||
default: '.eslintrc'
|
||||
ignore-file:
|
||||
description: 'Path to ESLint ignore file'
|
||||
required: false
|
||||
default: '.eslintignore'
|
||||
file-extensions:
|
||||
description: 'File extensions to lint (comma-separated)'
|
||||
required: false
|
||||
default: '.js,.jsx,.ts,.tsx'
|
||||
cache:
|
||||
description: 'Enable ESLint caching'
|
||||
required: false
|
||||
default: 'true'
|
||||
max-warnings:
|
||||
description: 'Maximum number of warnings allowed'
|
||||
required: false
|
||||
default: '0'
|
||||
fail-on-error:
|
||||
description: 'Fail workflow if issues are found'
|
||||
required: false
|
||||
default: 'true'
|
||||
report-format:
|
||||
description: 'Output format (stylish, json, sarif)'
|
||||
required: false
|
||||
default: 'sarif'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts'
|
||||
required: false
|
||||
default: '3'
|
||||
|
||||
outputs:
|
||||
error-count:
|
||||
description: 'Number of errors found'
|
||||
value: ${{ steps.lint.outputs.error_count }}
|
||||
warning-count:
|
||||
description: 'Number of warnings found'
|
||||
value: ${{ steps.lint.outputs.warning_count }}
|
||||
sarif-file:
|
||||
description: 'Path to SARIF report file'
|
||||
value: ${{ steps.lint.outputs.sarif_file }}
|
||||
files-checked:
|
||||
description: 'Number of files checked'
|
||||
value: ${{ steps.lint.outputs.files_checked }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate working directory
|
||||
if [ ! -d "${{ inputs.working-directory }}" ]; then
|
||||
echo "::error::Working directory does not exist: ${{ inputs.working-directory }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate file extensions
|
||||
if ! [[ "${{ inputs.file-extensions }}" =~ ^[.,a-zA-Z0-9]+$ ]]; then
|
||||
echo "::error::Invalid file extensions format"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate max warnings
|
||||
if ! [[ "${{ inputs.max-warnings }}" =~ ^[0-9]+$ ]]; then
|
||||
echo "::error::Invalid max warnings value"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: ivuorinen/actions/node-setup@main
|
||||
|
||||
- name: Install Dependencies
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Install ESLint and required dependencies
|
||||
echo "Installing ESLint dependencies..."
|
||||
|
||||
# Function to install with retries
|
||||
install_with_retries() {
|
||||
local attempt=1
|
||||
local max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Installation attempt $attempt of $max_attempts"
|
||||
|
||||
if npm install \
|
||||
eslint@${{ inputs.eslint-version }} \
|
||||
@typescript-eslint/parser \
|
||||
@typescript-eslint/eslint-plugin \
|
||||
eslint-plugin-import \
|
||||
eslint-config-prettier \
|
||||
typescript; then
|
||||
return 0
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Installation failed, waiting 10 seconds before retry..."
|
||||
sleep 10
|
||||
fi
|
||||
done
|
||||
|
||||
echo "::error::Failed to install dependencies after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
install_with_retries
|
||||
|
||||
- name: Prepare ESLint Configuration
|
||||
id: config
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create default config if none exists
|
||||
if [ ! -f "${{ inputs.config-file }}" ]; then
|
||||
echo "Creating default ESLint configuration..."
|
||||
cat > "${{ inputs.config-file }}" <<EOF
|
||||
{
|
||||
"root": true,
|
||||
"extends": [
|
||||
"eslint:recommended",
|
||||
"plugin:@typescript-eslint/recommended",
|
||||
"plugin:import/errors",
|
||||
"plugin:import/warnings",
|
||||
"plugin:import/typescript",
|
||||
"prettier"
|
||||
],
|
||||
"parser": "@typescript-eslint/parser",
|
||||
"parserOptions": {
|
||||
"ecmaVersion": 2022,
|
||||
"sourceType": "module"
|
||||
},
|
||||
"plugins": ["@typescript-eslint", "import"],
|
||||
"env": {
|
||||
"es2022": true,
|
||||
"node": true
|
||||
}
|
||||
}
|
||||
EOF
|
||||
fi
|
||||
|
||||
# Create default ignore file if none exists
|
||||
if [ ! -f "${{ inputs.ignore-file }}" ]; then
|
||||
echo "Creating default ESLint ignore file..."
|
||||
cat > "${{ inputs.ignore-file }}" <<EOF
|
||||
node_modules/
|
||||
dist/
|
||||
build/
|
||||
coverage/
|
||||
*.min.js
|
||||
EOF
|
||||
fi
|
||||
|
||||
- name: Run ESLint Check
|
||||
id: lint
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create reports directory
|
||||
mkdir -p reports
|
||||
|
||||
# Prepare file extensions for ESLint
|
||||
IFS=',' read -ra EXTENSIONS <<< "${{ inputs.file-extensions }}"
|
||||
ext_pattern=""
|
||||
for ext in "${EXTENSIONS[@]}"; do
|
||||
ext_pattern="$ext_pattern --ext $ext"
|
||||
done
|
||||
|
||||
# Run ESLint
|
||||
echo "Running ESLint..."
|
||||
npx eslint \
|
||||
$ext_pattern \
|
||||
--config ${{ inputs.config-file }} \
|
||||
--ignore-path ${{ inputs.ignore-file }} \
|
||||
${{ inputs.cache == 'true' && '--cache' || '' }} \
|
||||
--max-warnings ${{ inputs.max-warnings }} \
|
||||
--format=${{ inputs.report-format }} \
|
||||
--output-file=reports/eslint.${{ inputs.report-format }} \
|
||||
. || {
|
||||
error_code=$?
|
||||
|
||||
# Count errors and warnings
|
||||
if [ "${{ inputs.report-format }}" = "json" ]; then
|
||||
error_count=$(jq '[.[] | .errorCount] | add' reports/eslint.json)
|
||||
warning_count=$(jq '[.[] | .warningCount] | add' reports/eslint.json)
|
||||
else
|
||||
error_count=$(grep -c '"level": "error"' reports/eslint.sarif || true)
warning_count=$(grep -c '"level": "warning"' reports/eslint.sarif || true)
error_count=${error_count:-0}
warning_count=${warning_count:-0}
|
||||
fi
|
||||
|
||||
echo "error_count=${error_count}" >> $GITHUB_OUTPUT
|
||||
echo "warning_count=${warning_count}" >> $GITHUB_OUTPUT
|
||||
|
||||
if [ "${{ inputs.fail-on-error }}" = "true" ] && [ $error_code -ne 0 ]; then
|
||||
echo "::error::ESLint found ${error_count} errors and ${warning_count} warnings"
|
||||
exit $error_code
|
||||
fi
|
||||
}
|
||||
|
||||
# Count checked files
|
||||
find_args=(-false)
for ext in "${EXTENSIONS[@]}"; do
find_args+=(-o -name "*${ext}")
done
files_checked=$(find . -type f \( "${find_args[@]}" \) | wc -l)
|
||||
echo "files_checked=${files_checked}" >> $GITHUB_OUTPUT
|
||||
echo "sarif_file=reports/eslint.sarif" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Upload ESLint Results
|
||||
if: always() && inputs.report-format == 'sarif'
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: ${{ inputs.working-directory }}/reports/eslint.sarif
|
||||
category: eslint
|
||||
|
||||
- name: Cache Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Clean up ESLint cache if it exists
|
||||
if [ -f ".eslintcache" ]; then
|
||||
rm .eslintcache
|
||||
fi
|
||||
|
||||
# Remove temporary files
|
||||
rm -rf reports/
|
||||
17
eslint-fix/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/eslint-fix
|
||||
|
||||
## ESLint Fix
|
||||
|
||||
### Description
|
||||
|
||||
Fixes ESLint violations in a project.
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/eslint-fix@main
|
||||
```
|
||||
38
eslint-fix/action.yml
Normal file
@@ -0,0 +1,38 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: ESLint Fix
|
||||
description: Fixes ESLint violations in a project.
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'code'
|
||||
color: 'blue'
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Checkout Repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Set Git Config
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
- name: Node Setup
|
||||
uses: ivuorinen/actions/node-setup@main
|
||||
|
||||
- name: Install Dependencies
|
||||
shell: bash
|
||||
run: |
|
||||
npm install
|
||||
|
||||
- name: Run ESLint Fix
|
||||
shell: bash
|
||||
run: |
|
||||
npx eslint . --fix
|
||||
|
||||
- name: Push Fixes
|
||||
if: always()
|
||||
uses: stefanzweifel/git-auto-commit-action@v5
|
||||
with:
|
||||
commit_message: 'style: autofix ESLint violations'
|
||||
add_options: '-u'
|
||||
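Because this action commits fixes back with `git-auto-commit-action`, the calling workflow needs write access to repository contents. A sketch, with the trigger and schedule as assumptions:

```yaml
name: ESLint autofix
on:
  workflow_dispatch:
  schedule:
    - cron: '0 4 * * 1' # assumed weekly run

jobs:
  fix:
    runs-on: ubuntu-latest
    permissions:
      contents: write # required to push the autofix commit
    steps:
      - uses: ivuorinen/actions/eslint-fix@main
```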
36
github-release/README.md
Normal file
@@ -0,0 +1,36 @@
|
||||
# ivuorinen/actions/github-release
|
||||
|
||||
## GitHub Release
|
||||
|
||||
### Description
|
||||
|
||||
Creates a GitHub release with a version and changelog.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------- | ---------------------------------------------------- | -------- | ------- |
|
||||
| `version` | <p>The version for the release.</p> | `true` | `""` |
|
||||
| `changelog` | <p>The changelog or description for the release.</p> | `false` | `""` |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/github-release@main
|
||||
with:
|
||||
version:
|
||||
# The version for the release.
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
changelog:
|
||||
# The changelog or description for the release.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
```
|
||||
54
github-release/action.yml
Normal file
@@ -0,0 +1,54 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: GitHub Release
|
||||
description: 'Creates a GitHub release with a version and changelog.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'tag'
|
||||
color: 'blue'
|
||||
|
||||
inputs:
|
||||
version:
|
||||
description: 'The version for the release.'
|
||||
required: true
|
||||
changelog:
|
||||
description: 'The changelog or description for the release.'
|
||||
required: false
|
||||
default: ''
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Create GitHub Release with Autogenerated Changelog
|
||||
if: ${{ inputs.changelog == '' }}
|
||||
shell: bash
|
||||
run: |
|
||||
# Validate version format
|
||||
if [[ ! "${{ inputs.version }}" =~ ^v?[0-9]+\.[0-9]+\.[0-9]+(-[\w.]+)?(\+[\w.]+)?$ ]]; then
|
||||
echo "Error: Invalid version format. Must follow semantic versioning."
|
||||
exit 1
|
||||
fi
|
||||
# Escape special characters in inputs
VERSION=$(echo "${{ inputs.version }}" | sed 's/[&/\]/\\&/g')
gh release create "${VERSION}" \
--repo="${GITHUB_REPOSITORY}" \
--title="${VERSION}" \
--generate-notes
|
||||
|
||||
- name: Create GitHub Release with Custom Changelog
|
||||
if: ${{ inputs.changelog != '' }}
|
||||
shell: bash
|
||||
run: |
|
||||
# Validate version format
|
||||
if [[ ! "${{ inputs.version }}" =~ ^v?[0-9]+\.[0-9]+\.[0-9]+(-[\w.]+)?(\+[\w.]+)?$ ]]; then
|
||||
echo "Error: Invalid version format. Must follow semantic versioning."
|
||||
exit 1
|
||||
fi
|
||||
# Escape special characters in inputs
VERSION=$(echo "${{ inputs.version }}" | sed 's/[&/\]/\\&/g')
CHANGELOG=$(echo "${{ inputs.changelog }}" | sed 's/[&/\]/\\&/g')
gh release create "${VERSION}" \
--repo="${GITHUB_REPOSITORY}" \
--title="${VERSION}" \
--notes="${CHANGELOG}"
|
||||
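The `gh` CLI used above needs a token in its environment. A sketch of a tag-triggered calling workflow follows; the `GH_TOKEN` wiring and permissions are assumptions about the caller, not defined by this action.

```yaml
name: Release
on:
  push:
    tags: ['v*']

jobs:
  release:
    runs-on: ubuntu-latest
    permissions:
      contents: write
    env:
      GH_TOKEN: ${{ github.token }} # assumed: gh CLI reads this in the composite steps
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/github-release@main
        with:
          version: ${{ github.ref_name }}
```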
36
go-build/README.md
Normal file
@@ -0,0 +1,36 @@
|
||||
# ivuorinen/actions/go-build
|
||||
|
||||
## Go Build
|
||||
|
||||
### Description
|
||||
|
||||
Builds the Go project.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------- | ----------------------------------- | -------- | ------- |
|
||||
| `go-version` | <p>Go version to use.</p> | `false` | `""` |
|
||||
| `destination` | <p>Build destination directory.</p> | `false` | `./bin` |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/go-build@main
|
||||
with:
|
||||
go-version:
|
||||
# Go version to use.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
destination:
|
||||
# Build destination directory.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ./bin
|
||||
```
|
||||
34
go-build/action.yml
Normal file
@@ -0,0 +1,34 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Go Build
|
||||
description: 'Builds the Go project.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: package
|
||||
color: blue
|
||||
|
||||
inputs:
|
||||
go-version:
|
||||
description: 'Go version to use.'
|
||||
required: false
|
||||
destination:
|
||||
description: 'Build destination directory.'
|
||||
required: false
|
||||
default: './bin'
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Detect Go Version
id: detect-go-version
uses: ivuorinen/actions/go-version-detect@main
|
||||
|
||||
- name: Setup Go
|
||||
uses: actions/setup-go@v4
|
||||
with:
|
||||
go-version: '${{ steps.detect-go-version.outputs.go-version }}'
|
||||
|
||||
- name: Build Go Project
|
||||
shell: bash
|
||||
run: |
|
||||
go build -o ${{ inputs.destination }} ./...
|
||||
122
go-lint/README.md
Normal file
@@ -0,0 +1,122 @@
|
||||
# ivuorinen/actions/go-lint
|
||||
|
||||
## Go Lint Check
|
||||
|
||||
### Description
|
||||
|
||||
Run golangci-lint with advanced configuration, caching, and reporting
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------------------- | ---------------------------------------------------- | -------- | --------------- |
|
||||
| `working-directory` | <p>Directory containing Go files</p> | `false` | `.` |
|
||||
| `golangci-lint-version` | <p>Version of golangci-lint to use</p> | `false` | `latest` |
|
||||
| `go-version` | <p>Go version to use</p> | `false` | `stable` |
|
||||
| `config-file` | <p>Path to golangci-lint config file</p> | `false` | `.golangci.yml` |
|
||||
| `timeout` | <p>Timeout for analysis (e.g., 5m, 1h)</p> | `false` | `5m` |
|
||||
| `cache` | <p>Enable golangci-lint caching</p> | `false` | `true` |
|
||||
| `fail-on-error` | <p>Fail workflow if issues are found</p> | `false` | `true` |
|
||||
| `report-format` | <p>Output format (json, sarif, github-actions)</p> | `false` | `sarif` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts</p> | `false` | `3` |
|
||||
| `only-new-issues` | <p>Report only new issues since main branch</p> | `false` | `true` |
|
||||
| `disable-all` | <p>Disable all linters (useful with --enable-\*)</p> | `false` | `false` |
|
||||
| `enable-linters` | <p>Comma-separated list of linters to enable</p> | `false` | `""` |
|
||||
| `disable-linters` | <p>Comma-separated list of linters to disable</p> | `false` | `""` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ---------------- | ----------------------------------------- |
|
||||
| `error-count` | <p>Number of errors found</p> |
|
||||
| `sarif-file` | <p>Path to SARIF report file</p> |
|
||||
| `cache-hit` | <p>Indicates if there was a cache hit</p> |
|
||||
| `analyzed-files` | <p>Number of files analyzed</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/go-lint@main
|
||||
with:
|
||||
working-directory:
|
||||
# Directory containing Go files
|
||||
#
|
||||
# Required: false
|
||||
# Default: .
|
||||
|
||||
golangci-lint-version:
|
||||
# Version of golangci-lint to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: latest
|
||||
|
||||
go-version:
|
||||
# Go version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: stable
|
||||
|
||||
config-file:
|
||||
# Path to golangci-lint config file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .golangci.yml
|
||||
|
||||
timeout:
|
||||
# Timeout for analysis (e.g., 5m, 1h)
|
||||
#
|
||||
# Required: false
|
||||
# Default: 5m
|
||||
|
||||
cache:
|
||||
# Enable golangci-lint caching
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
fail-on-error:
|
||||
# Fail workflow if issues are found
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
report-format:
|
||||
# Output format (json, sarif, github-actions)
|
||||
#
|
||||
# Required: false
|
||||
# Default: sarif
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
|
||||
only-new-issues:
|
||||
# Report only new issues since main branch
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
disable-all:
|
||||
# Disable all linters (useful with --enable-*)
|
||||
#
|
||||
# Required: false
|
||||
# Default: false
|
||||
|
||||
enable-linters:
|
||||
# Comma-separated list of linters to enable
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
disable-linters:
|
||||
# Comma-separated list of linters to disable
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
```
|
||||
288
go-lint/action.yml
Normal file
@@ -0,0 +1,288 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Go Lint Check
|
||||
description: 'Run golangci-lint with advanced configuration, caching, and reporting'
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: code
|
||||
color: blue
|
||||
|
||||
inputs:
|
||||
working-directory:
|
||||
description: 'Directory containing Go files'
|
||||
required: false
|
||||
default: '.'
|
||||
golangci-lint-version:
|
||||
description: 'Version of golangci-lint to use'
|
||||
required: false
|
||||
default: 'latest'
|
||||
go-version:
|
||||
description: 'Go version to use'
|
||||
required: false
|
||||
default: 'stable'
|
||||
config-file:
|
||||
description: 'Path to golangci-lint config file'
|
||||
required: false
|
||||
default: '.golangci.yml'
|
||||
timeout:
|
||||
description: 'Timeout for analysis (e.g., 5m, 1h)'
|
||||
required: false
|
||||
default: '5m'
|
||||
cache:
|
||||
description: 'Enable golangci-lint caching'
|
||||
required: false
|
||||
default: 'true'
|
||||
fail-on-error:
|
||||
description: 'Fail workflow if issues are found'
|
||||
required: false
|
||||
default: 'true'
|
||||
report-format:
|
||||
description: 'Output format (json, sarif, github-actions)'
|
||||
required: false
|
||||
default: 'sarif'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts'
|
||||
required: false
|
||||
default: '3'
|
||||
only-new-issues:
|
||||
description: 'Report only new issues since main branch'
|
||||
required: false
|
||||
default: 'true'
|
||||
disable-all:
|
||||
description: 'Disable all linters (useful with --enable-*)'
|
||||
required: false
|
||||
default: 'false'
|
||||
enable-linters:
|
||||
description: 'Comma-separated list of linters to enable'
|
||||
required: false
|
||||
disable-linters:
|
||||
description: 'Comma-separated list of linters to disable'
|
||||
required: false
|
||||
|
||||
outputs:
|
||||
error-count:
|
||||
description: 'Number of errors found'
|
||||
value: ${{ steps.lint.outputs.error_count }}
|
||||
sarif-file:
|
||||
description: 'Path to SARIF report file'
|
||||
value: ${{ steps.lint.outputs.sarif_file }}
|
||||
cache-hit:
|
||||
description: 'Indicates if there was a cache hit'
|
||||
value: ${{ steps.cache.outputs.cache-hit }}
|
||||
analyzed-files:
|
||||
description: 'Number of files analyzed'
|
||||
value: ${{ steps.lint.outputs.analyzed_files }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate working directory
|
||||
if [ ! -d "${{ inputs.working-directory }}" ]; then
|
||||
echo "::error::Working directory does not exist: ${{ inputs.working-directory }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate timeout format
|
||||
if ! echo "${{ inputs.timeout }}" | grep -qE '^[0-9]+(h|m|s)$'; then
|
||||
echo "::error::Invalid timeout format. Expected format: 5m, 1h, etc."
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate linter lists if provided
|
||||
for linter_list in "${{ inputs.enable-linters }}" "${{ inputs.disable-linters }}"; do
|
||||
if [ -n "$linter_list" ]; then
|
||||
if ! echo "$linter_list" | grep -qE '^[a-zA-Z0-9,-]+$'; then
|
||||
echo "::error::Invalid linter list format"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Setup Go
|
||||
uses: actions/setup-go@v5
|
||||
with:
|
||||
go-version: ${{ inputs.go-version }}
|
||||
cache: true
|
||||
|
||||
- name: Set up Cache
|
||||
id: cache
|
||||
if: inputs.cache == 'true'
|
||||
uses: actions/cache@v4
|
||||
with:
|
||||
path: |
|
||||
~/.cache/golangci-lint
|
||||
~/.cache/go-build
|
||||
key: ${{ runner.os }}-golangci-${{ inputs.golangci-lint-version }}-${{ hashFiles('**/go.sum') }}-${{ hashFiles(inputs.config-file) }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-golangci-${{ inputs.golangci-lint-version }}-
|
||||
|
||||
- name: Install golangci-lint
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Function to install golangci-lint with retries
|
||||
install_golangci_lint() {
|
||||
local attempt=1
|
||||
local max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Installation attempt $attempt of $max_attempts"
|
||||
|
||||
if curl -sSfL https://raw.githubusercontent.com/golangci/golangci-lint/master/install.sh | \
|
||||
sh -s -- -b "$(go env GOPATH)/bin" \
|
||||
${{ inputs.golangci-lint-version != 'latest' && format('v{0}', inputs.golangci-lint-version) || '' }}; then
|
||||
return 0
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Installation failed, waiting 10 seconds before retry..."
|
||||
sleep 10
|
||||
fi
|
||||
done
|
||||
|
||||
echo "::error::Failed to install golangci-lint after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
install_golangci_lint
|
||||
|
||||
- name: Prepare Configuration
|
||||
id: config
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create default config if none exists
|
||||
if [ ! -f "${{ inputs.config-file }}" ]; then
|
||||
echo "Creating default golangci-lint configuration..."
|
||||
cat > "${{ inputs.config-file }}" <<EOF
|
||||
linters:
|
||||
enable-all: true
|
||||
disable:
|
||||
- exhaustivestruct
|
||||
- interfacer
|
||||
- scopelint
|
||||
- maligned
|
||||
|
||||
linters-settings:
|
||||
govet:
|
||||
check-shadowing: true
|
||||
golint:
|
||||
min-confidence: 0.8
|
||||
gocyclo:
|
||||
min-complexity: 15
|
||||
dupl:
|
||||
threshold: 100
|
||||
goconst:
|
||||
min-len: 3
|
||||
min-occurrences: 3
|
||||
|
||||
issues:
|
||||
exclude-use-default: false
|
||||
max-issues-per-linter: 0
|
||||
max-same-issues: 0
|
||||
new: true
|
||||
|
||||
run:
|
||||
deadline: ${{ inputs.timeout }}
|
||||
tests: true
|
||||
skip-dirs:
|
||||
- vendor
|
||||
- third_party
|
||||
EOF
|
||||
fi
|
||||
|
||||
- name: Run golangci-lint
|
||||
id: lint
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create reports directory
|
||||
mkdir -p reports
|
||||
|
||||
# Prepare linter configuration
|
||||
linter_args=""
|
||||
if [ "${{ inputs.disable-all }}" = "true" ]; then
|
||||
linter_args="--disable-all"
|
||||
fi
|
||||
|
||||
if [ -n "${{ inputs.enable-linters }}" ]; then
|
||||
linter_args="$linter_args --enable=${{ inputs.enable-linters }}"
|
||||
fi
|
||||
|
||||
if [ -n "${{ inputs.disable-linters }}" ]; then
|
||||
linter_args="$linter_args --disable=${{ inputs.disable-linters }}"
|
||||
fi
|
||||
|
||||
# Run golangci-lint
|
||||
echo "Running golangci-lint..."
|
||||
|
||||
result_file="reports/golangci-lint.${{ inputs.report-format }}"
|
||||
|
||||
GOLANGCI_LINT_CACHE="$HOME/.cache/golangci-lint" \
|
||||
golangci-lint run \
|
||||
--config "${{ inputs.config-file }}" \
|
||||
--timeout "${{ inputs.timeout }}" \
|
||||
${{ inputs.cache == 'true' && '--cache' || '--no-cache' }} \
|
||||
${{ inputs.only-new-issues == 'true' && '--new' || '' }} \
|
||||
--out-format "${{ inputs.report-format }}" \
|
||||
$linter_args \
|
||||
./... > "$result_file" || {
|
||||
exit_code=$?
|
||||
|
||||
# Count errors
|
||||
if [ "${{ inputs.report-format }}" = "json" ]; then
|
||||
error_count=$(jq '.Issues | length' "$result_file")
|
||||
else
|
||||
error_count=$(grep -c "level\": \"error\"" "$result_file" || true)
error_count=${error_count:-0}
|
||||
fi
|
||||
|
||||
echo "error_count=${error_count}" >> $GITHUB_OUTPUT
|
||||
|
||||
if [ "${{ inputs.fail-on-error }}" = "true" ]; then
|
||||
echo "::error::golangci-lint found ${error_count} issues"
|
||||
exit $exit_code
|
||||
fi
|
||||
}
|
||||
|
||||
# Count analyzed files
|
||||
analyzed_files=$(find . -type f -name "*.go" -not -path "./vendor/*" -not -path "./.git/*" | wc -l)
|
||||
echo "analyzed_files=${analyzed_files}" >> $GITHUB_OUTPUT
|
||||
echo "sarif_file=$result_file" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Upload Lint Results
|
||||
if: always() && inputs.report-format == 'sarif'
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: ${{ inputs.working-directory }}/reports/golangci-lint.sarif
|
||||
category: golangci-lint
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Remove temporary files
|
||||
rm -rf reports/
|
||||
|
||||
# Clean cache if not being preserved
|
||||
if [ "${{ inputs.cache }}" != "true" ]; then
|
||||
rm -rf ~/.cache/golangci-lint
|
||||
fi
|
||||
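A sketch of a CI job combining `go-build` and `go-lint`; the job layout, timeout, and permissions are assumptions about the calling repository.

```yaml
name: Go CI
on: [push, pull_request]

jobs:
  build-and-lint:
    runs-on: ubuntu-latest
    permissions:
      contents: read
      security-events: write # assumed to be needed for the SARIF upload done by go-lint
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/go-build@main
      - uses: ivuorinen/actions/go-lint@main
        with:
          timeout: 10m
```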
35
go-version-detect/README.md
Normal file
@@ -0,0 +1,35 @@
|
||||
# ivuorinen/actions/go-version-detect
|
||||
|
||||
## Go Version Detect
|
||||
|
||||
### Description
|
||||
|
||||
Detects the Go version from the project's go.mod file or defaults to a specified version.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------------- | -------------------------------------------------------- | -------- | ------- |
|
||||
| `default-version` | <p>Default Go version to use if go.mod is not found.</p> | `false` | `1.22` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------ | -------------------------------------- |
|
||||
| `go-version` | <p>Detected or default Go version.</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/go-version-detect@main
|
||||
with:
|
||||
default-version:
|
||||
# Default Go version to use if go.mod is not found.
|
||||
#
|
||||
# Required: false
|
||||
# Default: 1.22
|
||||
```
|
||||
36
go-version-detect/action.yml
Normal file
@@ -0,0 +1,36 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Go Version Detect
|
||||
description: "Detects the Go version from the project's go.mod file or defaults to a specified version."
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: code
|
||||
color: blue
|
||||
|
||||
inputs:
|
||||
default-version:
|
||||
description: 'Default Go version to use if go.mod is not found.'
|
||||
required: false
|
||||
default: '1.22'
|
||||
|
||||
outputs:
|
||||
go-version:
|
||||
description: 'Detected or default Go version.'
|
||||
value: ${{ steps.detect-go-version.outputs.go-version }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Detect Go Version
|
||||
id: detect-go-version
|
||||
shell: bash
|
||||
run: |
|
||||
if [ -f go.mod ]; then
|
||||
version=$(grep '^go ' go.mod | awk '{print $2}')
|
||||
echo "Detected Go version: $version"
|
||||
echo "go-version=$version" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "No go.mod found. Using default Go version."
|
||||
echo "go-version=${{ inputs.default-version }}" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
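A sketch showing how the output is typically consumed in a calling workflow; a step id is required for the output reference to resolve, and the `actions/setup-go@v5` choice is an assumption.

```yaml
steps:
  - uses: actions/checkout@v4
  - name: Detect Go version
    id: go-version
    uses: ivuorinen/actions/go-version-detect@main
  - uses: actions/setup-go@v5
    with:
      go-version: ${{ steps.go-version.outputs.go-version }}
```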
87
node-setup/README.md
Normal file
@@ -0,0 +1,87 @@
|
||||
# ivuorinen/actions/node-setup
|
||||
|
||||
## Node Setup
|
||||
|
||||
### Description
|
||||
|
||||
Sets up Node.js environment with advanced version management, caching, and tooling.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------------- | ------------------------------------------------------------------------ | -------- | ---------------------------- |
|
||||
| `default-version` | <p>Default Node.js version to use if no configuration file is found.</p> | `false` | `22` |
|
||||
| `package-manager` | <p>Node.js package manager to use (npm, yarn, pnpm)</p> | `false` | `npm` |
|
||||
| `registry-url` | <p>Custom NPM registry URL</p> | `false` | `https://registry.npmjs.org` |
|
||||
| `token` | <p>Auth token for private registry</p> | `false` | `""` |
|
||||
| `cache` | <p>Enable dependency caching</p> | `false` | `true` |
|
||||
| `install` | <p>Automatically install dependencies</p> | `false` | `true` |
|
||||
| `node-mirror` | <p>Custom Node.js binary mirror</p> | `false` | `""` |
|
||||
| `force-version` | <p>Force specific Node.js version regardless of config files</p> | `false` | `""` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ----------------- | ----------------------------------------- |
|
||||
| `node-version` | <p>Installed Node.js version</p> |
|
||||
| `package-manager` | <p>Selected package manager</p> |
|
||||
| `cache-hit` | <p>Indicates if there was a cache hit</p> |
|
||||
| `node-path` | <p>Path to Node.js installation</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/node-setup@main
|
||||
with:
|
||||
default-version:
|
||||
# Default Node.js version to use if no configuration file is found.
|
||||
#
|
||||
# Required: false
|
||||
# Default: 22
|
||||
|
||||
package-manager:
|
||||
# Node.js package manager to use (npm, yarn, pnpm)
|
||||
#
|
||||
# Required: false
|
||||
# Default: npm
|
||||
|
||||
registry-url:
|
||||
# Custom NPM registry URL
|
||||
#
|
||||
# Required: false
|
||||
# Default: https://registry.npmjs.org
|
||||
|
||||
token:
|
||||
# Auth token for private registry
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
cache:
|
||||
# Enable dependency caching
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
install:
|
||||
# Automatically install dependencies
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
node-mirror:
|
||||
# Custom Node.js binary mirror
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
force-version:
|
||||
# Force specific Node.js version regardless of config files
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
```
|
||||
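Version resolution prefers `force-version`, then `.nvmrc`, `.tool-versions`, the `engines.node` field in package.json, and finally `default-version`. A sketch of a workflow relying on that order; the pnpm choice and test command are assumptions.

```yaml
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - id: node
        uses: ivuorinen/actions/node-setup@main
        with:
          package-manager: pnpm # overridden automatically if a different lockfile is found
      - run: echo "Using Node ${{ steps.node.outputs.node-version }} via ${{ steps.node.outputs.package-manager }}"
      - run: pnpm test # assumed test script
```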
292
node-setup/action.yml
Normal file
@@ -0,0 +1,292 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Node Setup
|
||||
description: 'Sets up Node.js environment with advanced version management, caching, and tooling.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: server
|
||||
color: green
|
||||
|
||||
inputs:
|
||||
default-version:
|
||||
description: 'Default Node.js version to use if no configuration file is found.'
|
||||
required: false
|
||||
default: '22'
|
||||
package-manager:
|
||||
description: 'Node.js package manager to use (npm, yarn, pnpm)'
|
||||
required: false
|
||||
default: 'npm'
|
||||
registry-url:
|
||||
description: 'Custom NPM registry URL'
|
||||
required: false
|
||||
default: 'https://registry.npmjs.org'
|
||||
token:
|
||||
description: 'Auth token for private registry'
|
||||
required: false
|
||||
cache:
|
||||
description: 'Enable dependency caching'
|
||||
required: false
|
||||
default: 'true'
|
||||
install:
|
||||
description: 'Automatically install dependencies'
|
||||
required: false
|
||||
default: 'true'
|
||||
node-mirror:
|
||||
description: 'Custom Node.js binary mirror'
|
||||
required: false
|
||||
force-version:
|
||||
description: 'Force specific Node.js version regardless of config files'
|
||||
required: false
|
||||
|
||||
outputs:
|
||||
node-version:
|
||||
description: 'Installed Node.js version'
|
||||
value: ${{ steps.setup.outputs.node-version }}
|
||||
package-manager:
|
||||
description: 'Selected package manager'
|
||||
value: ${{ steps.config.outputs.package-manager }}
|
||||
cache-hit:
|
||||
description: 'Indicates if there was a cache hit'
|
||||
value: ${{ steps.deps-cache.outputs.cache-hit }}
|
||||
node-path:
|
||||
description: 'Path to Node.js installation'
|
||||
value: ${{ steps.config.outputs.node-path }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Version Detection
|
||||
id: version
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Function to validate Node.js version format
|
||||
validate_version() {
|
||||
local version=$1
|
||||
if ! [[ $version =~ ^[0-9]+(\.[0-9]+)*$ ]]; then
|
||||
echo "::error::Invalid Node.js version format: $version"
|
||||
return 1
|
||||
fi
|
||||
}
|
||||
|
||||
# Function to get version from .nvmrc
|
||||
get_nvmrc_version() {
|
||||
if [ -f .nvmrc ]; then
|
||||
local version
|
||||
version=$(cat .nvmrc | tr -d 'v' | tr -d ' ' | tr -d '\n')
|
||||
if validate_version "$version"; then
|
||||
echo "$version"
|
||||
return 0
|
||||
fi
|
||||
fi
|
||||
return 1
|
||||
}
|
||||
|
||||
# Function to get version from .tool-versions
|
||||
get_tool_versions_version() {
|
||||
if [ -f .tool-versions ]; then
|
||||
local version
|
||||
version=$(grep -E '^nodejs[[:space:]]' .tool-versions |
|
||||
sed 's/#.*//' |
|
||||
awk '{print $2}' |
|
||||
tr -d ' ' |
|
||||
tr -d '\n')
|
||||
if [ -n "$version" ] && validate_version "$version"; then
|
||||
echo "$version"
|
||||
return 0
|
||||
fi
|
||||
fi
|
||||
return 1
|
||||
}
|
||||
|
||||
# Function to get version from package.json
|
||||
get_package_json_version() {
|
||||
if [ -f package.json ]; then
|
||||
local version
|
||||
version=$(node -pe "try { require('./package.json').engines.node.replace(/[^0-9.]/g, '') } catch(e) { '' }")
|
||||
if [ -n "$version" ] && validate_version "$version"; then
|
||||
echo "$version"
|
||||
return 0
|
||||
fi
|
||||
fi
|
||||
return 1
|
||||
}
|
||||
|
||||
# Determine Node.js version
|
||||
if [ -n "${{ inputs.force-version }}" ]; then
|
||||
if ! validate_version "${{ inputs.force-version }}"; then
|
||||
exit 1
|
||||
fi
|
||||
version="${{ inputs.force-version }}"
|
||||
echo "Using forced Node.js version: $version"
|
||||
else
|
||||
version=$(get_nvmrc_version ||
|
||||
get_tool_versions_version ||
|
||||
get_package_json_version ||
|
||||
echo "${{ inputs.default-version }}")
|
||||
echo "Detected Node.js version: $version"
|
||||
fi
|
||||
|
||||
echo "version=$version" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Package Manager Detection
|
||||
id: pkg-manager
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate input package manager
|
||||
case "${{ inputs.package-manager }}" in
|
||||
npm|yarn|pnpm)
|
||||
pkg_manager="${{ inputs.package-manager }}"
|
||||
;;
|
||||
*)
|
||||
echo "::error::Invalid package manager specified: ${{ inputs.package-manager }}"
|
||||
exit 1
|
||||
;;
|
||||
esac
|
||||
|
||||
# Auto-detect if files exist
|
||||
if [ -f "yarn.lock" ]; then
|
||||
pkg_manager="yarn"
|
||||
elif [ -f "pnpm-lock.yaml" ]; then
|
||||
pkg_manager="pnpm"
|
||||
elif [ -f "package-lock.json" ]; then
|
||||
pkg_manager="npm"
|
||||
fi
|
||||
|
||||
echo "manager=$pkg_manager" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Setup Node.js
|
||||
id: setup
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: ${{ steps.version.outputs.version }}
|
||||
registry-url: ${{ inputs.registry-url }}
|
||||
cache: ${{ steps.pkg-manager.outputs.manager }}
|
||||
node-version-file: ''
|
||||
always-auth: ${{ inputs.token != '' }}
|
||||
cache-dependency-path: |
|
||||
**/package-lock.json
|
||||
**/yarn.lock
|
||||
**/pnpm-lock.yaml
|
||||
|
||||
- name: Configure Package Manager
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Configure package manager
|
||||
case "${{ steps.pkg-manager.outputs.manager }}" in
|
||||
yarn)
|
||||
if ! command -v yarn &> /dev/null; then
|
||||
echo "Installing Yarn..."
|
||||
npm install -g yarn
|
||||
fi
|
||||
# Configure Yarn settings
|
||||
yarn config set nodeLinker node-modules
|
||||
yarn config set checksumBehavior ignore
|
||||
;;
|
||||
pnpm)
|
||||
if ! command -v pnpm &> /dev/null; then
|
||||
echo "Installing pnpm..."
|
||||
npm install -g pnpm
|
||||
fi
|
||||
;;
|
||||
esac
|
||||
|
||||
# Configure registry authentication if token provided
|
||||
if [ -n "${{ inputs.token }}" ]; then
|
||||
echo "Configuring registry authentication..."
|
||||
case "${{ steps.pkg-manager.outputs.manager }}" in
|
||||
npm)
|
||||
npm config set //${{ inputs.registry-url }}/:_authToken ${{ inputs.token }}
|
||||
;;
|
||||
yarn)
|
||||
yarn config set npmAuthToken ${{ inputs.token }}
|
||||
;;
|
||||
pnpm)
|
||||
pnpm config set //registry.npmjs.org/:_authToken ${{ inputs.token }}
|
||||
;;
|
||||
esac
|
||||
fi
|
||||
|
||||
- name: Setup Caching
|
||||
if: inputs.cache == 'true'
|
||||
id: deps-cache
|
||||
uses: actions/cache@v4
|
||||
with:
|
||||
path: |
|
||||
**/node_modules
|
||||
~/.npm
|
||||
~/.pnpm-store
|
||||
~/.yarn/cache
|
||||
key: ${{ runner.os }}-node-${{ steps.version.outputs.version }}-${{ steps.pkg-manager.outputs.manager }}-${{ hashFiles('**/package-lock.json', '**/yarn.lock', '**/pnpm-lock.yaml') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-node-${{ steps.version.outputs.version }}-${{ steps.pkg-manager.outputs.manager }}-
|
||||
|
||||
- name: Install Dependencies
|
||||
if: inputs.install == 'true'
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
echo "Installing dependencies using ${{ steps.pkg-manager.outputs.manager }}..."
|
||||
|
||||
case "${{ steps.pkg-manager.outputs.manager }}" in
|
||||
npm)
|
||||
npm ci --prefer-offline --no-audit --no-fund
|
||||
;;
|
||||
yarn)
|
||||
yarn install --frozen-lockfile --prefer-offline --non-interactive
|
||||
;;
|
||||
pnpm)
|
||||
pnpm install --frozen-lockfile --prefer-offline
|
||||
;;
|
||||
esac
|
||||
|
||||
- name: Verify Setup
|
||||
id: verify
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Verify Node.js installation
|
||||
echo "Verifying Node.js installation..."
|
||||
node_version=$(node --version)
|
||||
echo "Node.js version: $node_version"
|
||||
|
||||
# Verify package manager installation
|
||||
echo "Verifying package manager installation..."
|
||||
case "${{ steps.pkg-manager.outputs.manager }}" in
|
||||
npm)
|
||||
npm --version
|
||||
;;
|
||||
yarn)
|
||||
yarn --version
|
||||
;;
|
||||
pnpm)
|
||||
pnpm --version
|
||||
;;
|
||||
esac
|
||||
|
||||
# Verify module resolution
|
||||
if [ -f "package.json" ]; then
|
||||
echo "Verifying module resolution..."
|
||||
node -e "require('./package.json')"
|
||||
fi
|
||||
|
||||
- name: Output Configuration
|
||||
id: config
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Output final configuration
|
||||
{
|
||||
echo "node-version=$(node --version)"
|
||||
echo "node-path=$(which node)"
|
||||
echo "package-manager=${{ steps.pkg-manager.outputs.manager }}"
|
||||
} >> $GITHUB_OUTPUT
|
||||
43
npm-publish/README.md
Normal file
@@ -0,0 +1,43 @@
|
||||
# ivuorinen/actions/npm-publish
|
||||
|
||||
## Publish to NPM
|
||||
|
||||
### Description
|
||||
|
||||
Publishes the package to the NPM registry with configurable scope and registry URL.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ----------------- | ----------------------------------- | -------- | -------------------------------------- |
|
||||
| `registry-url` | <p>Registry URL for publishing.</p> | `false` | `https://registry.npmjs.org/` |
|
||||
| `scope` | <p>Package scope to use.</p> | `false` | `@ivuorinen` |
|
||||
| `package-version` | <p>The version to publish.</p> | `false` | `${{ github.event.release.tag_name }}` |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/npm-publish@main
|
||||
with:
|
||||
registry-url:
|
||||
# Registry URL for publishing.
|
||||
#
|
||||
# Required: false
|
||||
# Default: https://registry.npmjs.org/
|
||||
|
||||
scope:
|
||||
# Package scope to use.
|
||||
#
|
||||
# Required: false
|
||||
# Default: @ivuorinen
|
||||
|
||||
package-version:
|
||||
# The version to publish.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ${{ github.event.release.tag_name }}
|
||||
```
|
||||
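Composite actions cannot read repository secrets directly, so the registry token has to come from the calling workflow. A sketch where the `NPM_TOKEN` secret name and the release trigger are assumptions:

```yaml
name: Publish package
on:
  release:
    types: [published]

jobs:
  publish:
    runs-on: ubuntu-latest
    permissions:
      contents: read
    env:
      NPM_TOKEN: ${{ secrets.NPM_TOKEN }} # assumed secret name in the calling repository
    steps:
      - uses: actions/checkout@v4
      - uses: ivuorinen/actions/npm-publish@main
        with:
          package-version: ${{ github.event.release.tag_name }}
```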
57
npm-publish/action.yml
Normal file
@@ -0,0 +1,57 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Publish to NPM
|
||||
description: 'Publishes the package to the NPM registry with configurable scope and registry URL.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: package
|
||||
color: green
|
||||
|
||||
inputs:
|
||||
registry-url:
|
||||
description: 'Registry URL for publishing.'
|
||||
required: false
|
||||
default: 'https://registry.npmjs.org/'
|
||||
scope:
|
||||
description: 'Package scope to use.'
|
||||
required: false
|
||||
default: '@ivuorinen'
|
||||
package-version:
|
||||
description: 'The version to publish.'
|
||||
required: false
|
||||
default: ${{ github.event.release.tag_name }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Setup Node.js
|
||||
uses: ivuorinen/actions/node-setup@main
|
||||
|
||||
- name: Authenticate NPM
shell: bash
run: |
# Composite actions cannot read the secrets context; the caller is expected
# to expose NPM_TOKEN in the environment (an assumption about the calling workflow).
registry="${{ inputs.registry-url }}"
registry="${registry#https:}"
echo "${registry%/}/:_authToken=${NPM_TOKEN}" > ~/.npmrc
|
||||
|
||||
- name: Publish Package
|
||||
shell: bash
|
||||
|
||||
run: |
|
||||
pkg_version=$(node -p "require('./package.json').version")
|
||||
if [ "$pkg_version" != "${{ inputs.package-version }}" ]; then
|
||||
echo "Version mismatch: package.json ($pkg_version) != input (${{ inputs.package-version }})"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Dry run first
|
||||
npm publish \
|
||||
--registry ${{ inputs.registry-url }} \
|
||||
--dry-run \
|
||||
--scope ${{ inputs.scope }}
|
||||
|
||||
npm publish \
|
||||
--registry ${{ inputs.registry-url }} \
|
||||
--verbose \
|
||||
--scope ${{ inputs.scope }} \
|
||||
--tag ${{ inputs.package-version }}
|
||||
94
php-composer/README.md
Normal file
@@ -0,0 +1,94 @@
|
||||
# ivuorinen/actions/php-composer
|
||||
|
||||
## Run Composer Install
|
||||
|
||||
### Description
|
||||
|
||||
Runs Composer install on a repository with advanced caching and configuration.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------- | ------------------------------------------------------------- | -------- | --------------------------------------------------- |
|
||||
| `php` | <p>PHP Version to use.</p> | `true` | `8.4` |
|
||||
| `extensions` | <p>Comma-separated list of PHP extensions to install</p> | `false` | `mbstring, xml, zip, curl, json` |
|
||||
| `tools` | <p>Comma-separated list of Composer tools to install</p> | `false` | `composer:v2` |
|
||||
| `args` | <p>Arguments to pass to Composer.</p> | `false` | `--no-progress --prefer-dist --optimize-autoloader` |
|
||||
| `composer-version` | <p>Composer version to use (1 or 2)</p> | `false` | `2` |
|
||||
| `stability` | <p>Minimum stability (stable, RC, beta, alpha, dev)</p> | `false` | `stable` |
|
||||
| `cache-directories` | <p>Additional directories to cache (comma-separated)</p> | `false` | `""` |
|
||||
| `token` | <p>GitHub token for private repository access</p> | `false` | `${{ github.token }}` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts for Composer commands</p> | `false` | `3` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------------ | ----------------------------------------------- |
|
||||
| `lock` | <p>composer.lock or composer.json file hash</p> |
|
||||
| `php-version` | <p>Installed PHP version</p> |
|
||||
| `composer-version` | <p>Installed Composer version</p> |
|
||||
| `cache-hit` | <p>Indicates if there was a cache hit</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/php-composer@main
|
||||
with:
|
||||
php:
|
||||
# PHP Version to use.
|
||||
#
|
||||
# Required: true
|
||||
# Default: 8.4
|
||||
|
||||
extensions:
|
||||
# Comma-separated list of PHP extensions to install
|
||||
#
|
||||
# Required: false
|
||||
# Default: mbstring, xml, zip, curl, json
|
||||
|
||||
tools:
|
||||
# Comma-separated list of Composer tools to install
|
||||
#
|
||||
# Required: false
|
||||
# Default: composer:v2
|
||||
|
||||
args:
|
||||
# Arguments to pass to Composer.
|
||||
#
|
||||
# Required: false
|
||||
# Default: --no-progress --prefer-dist --optimize-autoloader
|
||||
|
||||
composer-version:
|
||||
# Composer version to use (1 or 2)
|
||||
#
|
||||
# Required: false
|
||||
# Default: 2
|
||||
|
||||
stability:
|
||||
# Minimum stability (stable, RC, beta, alpha, dev)
|
||||
#
|
||||
# Required: false
|
||||
# Default: stable
|
||||
|
||||
cache-directories:
|
||||
# Additional directories to cache (comma-separated)
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
token:
|
||||
# GitHub token for private repository access
|
||||
#
|
||||
# Required: false
|
||||
# Default: ${{ github.token }}
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts for Composer commands
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
```
|
||||
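A sketch of a PHP CI job using the action and its `cache-hit` output; the PHPUnit step is an assumption about the project layout, not something this action provides.

```yaml
jobs:
  php:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - id: composer
        uses: ivuorinen/actions/php-composer@main
        with:
          php: '8.4'
      - run: echo "Composer cache hit: ${{ steps.composer.outputs.cache-hit }}"
      - run: vendor/bin/phpunit # assumed test runner
```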
@@ -1,59 +1,252 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Run Composer Install on defined PHP version
|
||||
description: "Runs Composer install on the repository"
|
||||
author: "Ismo Vuorinen"
|
||||
name: Run Composer Install
|
||||
description: 'Runs Composer install on a repository with advanced caching and configuration.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: package
|
||||
icon: server
|
||||
color: gray-dark
|
||||
|
||||
inputs:
|
||||
php:
|
||||
description: 'PHP Version to use'
|
||||
description: 'PHP Version to use.'
|
||||
required: true
|
||||
default: "8.3"
|
||||
default: '8.4'
|
||||
extensions:
|
||||
description: 'Comma-separated list of PHP extensions to install'
|
||||
required: false
|
||||
default: 'mbstring, xml, zip, curl, json'
|
||||
tools:
|
||||
description: 'Comma-separated list of Composer tools to install'
|
||||
required: false
|
||||
default: 'composer:v2'
|
||||
args:
|
||||
description: 'Arguments to pass to Composer'
|
||||
description: 'Arguments to pass to Composer.'
|
||||
required: false
|
||||
default: '--no-progress --prefer-dist --optimize-autoloader'
|
||||
composer-version:
|
||||
description: 'Composer version to use (1 or 2)'
|
||||
required: false
|
||||
default: '2'
|
||||
stability:
|
||||
description: 'Minimum stability (stable, RC, beta, alpha, dev)'
|
||||
required: false
|
||||
default: 'stable'
|
||||
cache-directories:
|
||||
description: 'Additional directories to cache (comma-separated)'
|
||||
required: false
|
||||
default: ''
|
||||
token:
|
||||
description: 'GitHub token for private repository access'
|
||||
required: false
|
||||
default: ${{ github.token }}
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts for Composer commands'
|
||||
required: false
|
||||
default: '3'
|
||||
|
||||
outputs:
|
||||
lock:
|
||||
description: "composer.lock or composer.json file hash"
|
||||
description: 'composer.lock or composer.json file hash'
|
||||
value: ${{ steps.hash.outputs.lock }}
|
||||
php-version:
|
||||
description: 'Installed PHP version'
|
||||
value: ${{ steps.php.outputs.version }}
|
||||
composer-version:
|
||||
description: 'Installed Composer version'
|
||||
value: ${{ steps.composer.outputs.version }}
|
||||
cache-hit:
|
||||
description: 'Indicates if there was a cache hit'
|
||||
value: ${{ steps.composer-cache.outputs.cache-hit }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
|
||||
steps:
|
||||
- name: Get composer.lock or composer.json hash for caching
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate PHP version
|
||||
if ! [[ "${{ inputs.php }}" =~ ^([5-9]\.[0-9]+|[1-9][0-9]+\.[0-9]+)$ ]]; then
|
||||
echo "::error::Invalid PHP version format: ${{ inputs.php }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate Composer version
|
||||
if ! [[ "${{ inputs.composer-version }}" =~ ^[12]$ ]]; then
|
||||
echo "::error::Invalid Composer version: ${{ inputs.composer-version }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate stability
|
||||
if ! [[ "${{ inputs.stability }}" =~ ^(stable|RC|beta|alpha|dev)$ ]]; then
|
||||
echo "::error::Invalid stability option: ${{ inputs.stability }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Setup PHP
|
||||
id: php
|
||||
uses: shivammathur/setup-php@v2
|
||||
with:
|
||||
php-version: ${{ inputs.php }}
|
||||
extensions: ${{ inputs.extensions }}
|
||||
tools: ${{ inputs.tools }}
|
||||
coverage: none
|
||||
ini-values: memory_limit=1G, max_execution_time=600
|
||||
fail-fast: true
|
||||
|
||||
- name: Get Dependency Hashes
|
||||
id: hash
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Function to calculate directory hash
|
||||
calculate_dir_hash() {
|
||||
local dir=$1
|
||||
if [ -d "$dir" ]; then
|
||||
find "$dir" -type f -exec sha256sum {} \; | sort | sha256sum | cut -d' ' -f1
|
||||
fi
|
||||
}
|
||||
|
||||
# Get composer.lock hash or composer.json hash
|
||||
if [ -f composer.lock ]; then
|
||||
echo "lock=${{ hashFiles('**/composer.lock') }}" >> $GITHUB_OUTPUT
|
||||
else
|
||||
echo "lock=${{ hashFiles('**/composer.json') }}" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
|
||||
# Calculate additional directory hashes
|
||||
if [ -n "${{ inputs.cache-directories }}" ]; then
|
||||
IFS=',' read -ra DIRS <<< "${{ inputs.cache-directories }}"
|
||||
for dir in "${DIRS[@]}"; do
|
||||
dir_hash=$(calculate_dir_hash "$dir")
|
||||
if [ -n "$dir_hash" ]; then
|
||||
echo "${dir}_hash=$dir_hash" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
- name: Configure Composer
|
||||
id: composer
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Configure Composer environment
|
||||
composer config --global process-timeout 600
|
||||
composer config --global allow-plugins true
|
||||
composer config --global github-oauth.github.com "${{ inputs.token }}"
|
||||
|
||||
if [ "${{ inputs.stability }}" != "stable" ]; then
|
||||
composer config minimum-stability ${{ inputs.stability }}
|
||||
fi
|
||||
|
||||
# Verify Composer installation
|
||||
composer_full_version=$(composer --version | grep -oP 'Composer version \K[0-9]+\.[0-9]+\.[0-9]+')
|
||||
if [ -z "$composer_full_version" ]; then
|
||||
echo "::error::Failed to detect Composer version"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Extract major version for comparison
|
||||
composer_major_version=${composer_full_version%%.*}
|
||||
expected_version="${{ inputs.composer-version }}"
|
||||
|
||||
echo "Detected Composer version: $composer_full_version (major: $composer_major_version)"
|
||||
|
||||
if [ "$composer_major_version" != "$expected_version" ]; then
|
||||
echo "::error::Composer major version mismatch. Expected $expected_version.x, got $composer_full_version"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Store full version for output
|
||||
echo "version=$composer_full_version" >> "$GITHUB_OUTPUT"
|
||||
|
||||
# Log Composer configuration
|
||||
echo "Composer Configuration:"
|
||||
composer config --list
|
||||
|
||||
- name: Cache Composer packages
|
||||
id: composer-cache
|
||||
uses: actions/cache@v4
|
||||
with:
|
||||
path: vendor
|
||||
key: ${{ runner.os }}-php-${{ inputs.php }}-${{ runs.hash.outputs.lock }}
|
||||
path: |
|
||||
vendor
|
||||
~/.composer/cache
|
||||
${{ inputs.cache-directories }}
|
||||
key: ${{ runner.os }}-php-${{ inputs.php }}-composer-${{ inputs.composer-version }}-${{ steps.hash.outputs.lock }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-php-${{ inputs.php }}-${{ runs.hash.outputs.lock }}
|
||||
${{ runner.os }}-php-${{ inputs.php }}-composer-${{ inputs.composer-version }}-
|
||||
${{ runner.os }}-php-${{ inputs.php }}-composer-
|
||||
${{ runner.os }}-php-${{ inputs.php }}-
|
||||
${{ runner.os }}-php-
|
||||
|
||||
- name: PHP ${{ inputs.php }}
|
||||
uses: shivammathur/setup-php@v2
|
||||
with:
|
||||
php-version: ${{ inputs.php }}
|
||||
tools: composer
|
||||
extensions: ${{ inputs.extensions }}
|
||||
|
||||
- name: Composer Install
|
||||
- name: Install Dependencies
|
||||
shell: bash
|
||||
run: composer install ${{ inputs.args }}
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Function to run composer with retries
|
||||
run_composer() {
|
||||
local attempt=1
|
||||
local max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Composer install attempt $attempt of $max_attempts"
|
||||
|
||||
if composer install ${{ inputs.args }}; then
|
||||
return 0
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Composer install failed, waiting 30 seconds before retry..."
|
||||
sleep 30
|
||||
|
||||
# Clear composer cache if retry needed
|
||||
if [ $attempt -eq $max_attempts ]; then
|
||||
echo "Clearing Composer cache before final attempt..."
|
||||
composer clear-cache
|
||||
fi
|
||||
fi
|
||||
done
|
||||
|
||||
echo "::error::Composer install failed after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
# Run Composer install with retry logic
|
||||
run_composer
|
||||
|
||||
- name: Verify Installation
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Verify vendor directory
|
||||
if [ ! -d "vendor" ]; then
|
||||
echo "::error::vendor directory not found"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Verify autoloader
|
||||
if [ ! -f "vendor/autoload.php" ]; then
|
||||
echo "::error::autoload.php not found"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Check for any PHP errors in vendor
|
||||
find vendor -name "*.php" -type f -exec php -l {} \; > /dev/null
|
||||
|
||||
# Verify Composer installation
|
||||
composer validate --no-check-all --strict
|
||||
|
||||
- name: Generate Optimized Autoloader
|
||||
if: success()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
composer dump-autoload --optimize --classmap-authoritative
|
||||
|
||||
59
php-laravel-phpunit/README.md
Normal file
59
php-laravel-phpunit/README.md
Normal file
@@ -0,0 +1,59 @@
|
||||
# ivuorinen/actions/php-laravel-phpunit
|
||||
|
||||
## Laravel Setup and Composer test
|
||||
|
||||
### Description
|
||||
|
||||
Setup PHP, install dependencies, generate key, create database and run composer test
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------ | --------------------------------------------------------------------------------------------------------------------- | -------- | ------------------------------------------- |
|
||||
| `php-version` | <p>PHP Version to use, see https://github.com/marketplace/actions/setup-php-action#php-version-optional</p> | `false` | `latest` |
|
||||
| `php-version-file` | <p>PHP Version file to use, see https://github.com/marketplace/actions/setup-php-action#php-version-file-optional</p> | `false` | `.php-version` |
|
||||
| `extensions` | <p>PHP extensions to install, see https://github.com/marketplace/actions/setup-php-action#extensions-optional</p> | `false` | `mbstring, intl, json, pdo_sqlite, sqlite3` |
|
||||
| `coverage` | <p>Specify code-coverage driver, see https://github.com/marketplace/actions/setup-php-action#coverage-optional</p> | `false` | `none` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------------ | ---------------------------------------------- |
|
||||
| `php-version` | <p>The PHP version that was setup</p> |
|
||||
| `php-version-file` | <p>The PHP version file that was used</p> |
|
||||
| `extensions` | <p>The PHP extensions that were installed</p> |
|
||||
| `coverage` | <p>The code-coverage driver that was setup</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/php-laravel-phpunit@main
|
||||
with:
|
||||
php-version:
|
||||
# PHP Version to use, see https://github.com/marketplace/actions/setup-php-action#php-version-optional
|
||||
#
|
||||
# Required: false
|
||||
# Default: latest
|
||||
|
||||
php-version-file:
|
||||
# PHP Version file to use, see https://github.com/marketplace/actions/setup-php-action#php-version-file-optional
|
||||
#
|
||||
# Required: false
|
||||
# Default: .php-version
|
||||
|
||||
extensions:
|
||||
# PHP extensions to install, see https://github.com/marketplace/actions/setup-php-action#extensions-optional
|
||||
#
|
||||
# Required: false
|
||||
# Default: mbstring, intl, json, pdo_sqlite, sqlite3
|
||||
|
||||
coverage:
|
||||
# Specify code-coverage driver, see https://github.com/marketplace/actions/setup-php-action#coverage-optional
|
||||
#
|
||||
# Required: false
|
||||
# Default: none
|
||||
```
|
||||
@@ -1,53 +1,54 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Laravel Setup and Composer test
|
||||
description: 'Setup PHP, install dependencies, generate key, create database and run composer test'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
on:
|
||||
push:
|
||||
branches: [main]
|
||||
pull_request:
|
||||
branches: [main]
|
||||
workflow_call:
|
||||
inputs:
|
||||
branding:
|
||||
icon: 'terminal'
|
||||
color: 'blue'
|
||||
|
||||
inputs:
|
||||
php-version:
|
||||
description: 'PHP Version to use, see https://github.com/marketplace/actions/setup-php-action#php-version-optional'
|
||||
required: false
|
||||
default: 'latest'
|
||||
type: string
|
||||
php-version-file:
|
||||
description: 'PHP Version file to use, see https://github.com/marketplace/actions/setup-php-action#php-version-file-optional'
|
||||
required: false
|
||||
default: '.php-version'
|
||||
type: string
|
||||
extensions:
|
||||
description: 'PHP extensions to install, see https://github.com/marketplace/actions/setup-php-action#extensions-optional'
|
||||
required: false
|
||||
default: 'mbstring, intl, json, pdo_sqlite, sqlite3'
|
||||
type: string
|
||||
coverage:
|
||||
description: 'Specify code-coverage driver, see https://github.com/marketplace/actions/setup-php-action#coverage-optional'
|
||||
required: false
|
||||
default: 'none'
|
||||
type: string
|
||||
|
||||
jobs:
|
||||
laravel-tests:
|
||||
runs-on: ubuntu-latest
|
||||
|
||||
outputs:
|
||||
check_files: ${{ steps.check_files.outputs.files_exists }}
|
||||
|
||||
permissions:
|
||||
contents: write
|
||||
statuses: write
|
||||
outputs:
|
||||
php-version:
|
||||
description: 'The PHP version that was setup'
|
||||
value: ${{ steps.setup-php.outputs.php-version }}
|
||||
php-version-file:
|
||||
description: 'The PHP version file that was used'
|
||||
value: ${{ steps.setup-php.outputs.php-version-file }}
|
||||
extensions:
|
||||
description: 'The PHP extensions that were installed'
|
||||
value: ${{ steps.setup-php.outputs.extensions }}
|
||||
coverage:
|
||||
description: 'The code-coverage driver that was setup'
|
||||
value: ${{ steps.setup-php.outputs.coverage }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- uses: shivammathur/setup-php@v2
|
||||
with:
|
||||
php-version: ${{ github.event.inputs.php-version }}
|
||||
php-version-file: ${{ github.event.inputs.php-version-file }}
|
||||
extensions: ${{ github.event.inputs.extensions }}
|
||||
coverage: ${{ github.event.inputs.coverage }}
|
||||
php-version: ${{ inputs.php-version }}
|
||||
php-version-file: ${{ inputs.php-version-file }}
|
||||
extensions: ${{ inputs.extensions }}
|
||||
coverage: ${{ inputs.coverage }}
|
||||
|
||||
- uses: actions/checkout@v4
|
||||
|
||||
@@ -59,28 +60,34 @@ jobs:
|
||||
|
||||
- name: Copy .env
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
run: php -r "file_exists('.env') || copy('.env.example', '.env');"
|
||||
|
||||
- name: Install Dependencies
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
run: composer install -q --no-ansi --no-interaction --no-scripts --no-progress --prefer-dist
|
||||
|
||||
- name: Generate key
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
run: php artisan key:generate
|
||||
|
||||
- name: Directory Permissions
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
run: chmod -R 777 storage bootstrap/cache
|
||||
|
||||
- name: Create Database
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
run: |
|
||||
mkdir -p database
|
||||
touch database/database.sqlite
|
||||
|
||||
- name: Execute composer test (Unit and Feature tests)
|
||||
if: steps.check_files.outputs.files_exists == 'true'
|
||||
shell: bash
|
||||
env:
|
||||
DB_CONNECTION: sqlite
|
||||
DB_DATABASE: database/database.sqlite
|
||||
|
||||
17
php-tests/README.md
Normal file
17
php-tests/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/php-tests
|
||||
|
||||
## PHP Tests
|
||||
|
||||
### Description
|
||||
|
||||
Run PHPUnit tests on the repository
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/php-tests@main
|
||||
```
|
||||
22
php-tests/action.yml
Normal file
22
php-tests/action.yml
Normal file
@@ -0,0 +1,22 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: PHP Tests
|
||||
description: Run PHPUnit tests on the repository
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: check-circle
|
||||
color: green
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Set Git Config
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
- name: Composer Install
|
||||
uses: ivuorinen/actions/php-composer@main
|
||||
|
||||
- name: Run PHPUnit Tests
|
||||
shell: bash
|
||||
run: vendor/bin/phpunit --verbose
|
||||
17
pr-lint/README.md
Normal file
17
pr-lint/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/pr-lint
|
||||
|
||||
## MegaLinter
|
||||
|
||||
### Description
|
||||
|
||||
Run MegaLinter on the repository
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/pr-lint@main
|
||||
```
|
||||
157
pr-lint/action.yml
Normal file
157
pr-lint/action.yml
Normal file
@@ -0,0 +1,157 @@
|
||||
# MegaLinter GitHub Action configuration file
|
||||
# More info at https://megalinter.io
|
||||
---
|
||||
name: MegaLinter
|
||||
description: Run MegaLinter on the repository
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: check-circle
|
||||
color: green
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
# Git Checkout
|
||||
- name: Checkout Code
|
||||
uses: actions/checkout@v4
|
||||
with:
|
||||
token: ${{ secrets.PAT || secrets.GITHUB_TOKEN }}
|
||||
|
||||
# If you use VALIDATE_ALL_CODEBASE = true, you can remove this line to
|
||||
# improve performance
|
||||
fetch-depth: 0
|
||||
|
||||
- name: Setup Git Config
|
||||
id: git-config
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
# MegaLinter
|
||||
- name: MegaLinter
|
||||
# You can override MegaLinter flavor used to have faster performances
|
||||
# More info at https://megalinter.io/latest/flavors/
|
||||
uses: oxsecurity/megalinter@v8
|
||||
id: ml
|
||||
|
||||
# All available variables are described in documentation
|
||||
# https://megalinter.io/latest/configuration/
|
||||
env:
|
||||
# Validates all source when push on main, else just the git diff with
|
||||
# main. Override with true if you always want to lint all sources
|
||||
#
|
||||
# To validate the entire codebase, set to:
|
||||
# VALIDATE_ALL_CODEBASE: true
|
||||
#
|
||||
# To validate only diff with main, set to:
|
||||
# VALIDATE_ALL_CODEBASE: >-
|
||||
# ${{
|
||||
# github.event_name == 'push' &&
|
||||
# contains(fromJSON('["refs/heads/main", "refs/heads/master"]'), github.ref)
|
||||
# }}
|
||||
VALIDATE_ALL_CODEBASE: >-
|
||||
${{
|
||||
github.event_name == 'push' &&
|
||||
contains(fromJSON('["refs/heads/main", "refs/heads/master"]'), github.ref)
|
||||
}}
|
||||
|
||||
GITHUB_TOKEN: ${{ steps.git-config.outputs.token || secrets.PAT || secrets.GITHUB_TOKEN }}
|
||||
|
||||
# Apply linter fixes configuration
|
||||
#
|
||||
# When active, APPLY_FIXES must also be defined as environment variable
|
||||
# (in .github/workflows/mega-linter.yml or other CI tool)
|
||||
APPLY_FIXES: all
|
||||
|
||||
# Decide which event triggers application of fixes in a commit or a PR
|
||||
# (pull_request, push, all)
|
||||
APPLY_FIXES_EVENT: pull_request
|
||||
|
||||
# If APPLY_FIXES is used, defines if the fixes are directly committed (commit)
|
||||
# or posted in a PR (pull_request)
|
||||
APPLY_FIXES_MODE: commit
|
||||
|
||||
# ADD YOUR CUSTOM ENV VARIABLES HERE OR DEFINE THEM IN A FILE
|
||||
# .mega-linter.yml AT THE ROOT OF YOUR REPOSITORY
|
||||
|
||||
# Uncomment to disable copy-paste and spell checks
|
||||
DISABLE: COPYPASTE,SPELL
|
||||
|
||||
# Upload MegaLinter artifacts
|
||||
- name: Archive production artifacts
|
||||
if: success() || failure()
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: MegaLinter reports
|
||||
include-hidden-files: 'true'
|
||||
path: |
|
||||
megalinter-reports
|
||||
mega-linter.log
|
||||
|
||||
# Set APPLY_FIXES_IF var for use in future steps
|
||||
- name: Set APPLY_FIXES_IF var
|
||||
run: |
|
||||
printf 'APPLY_FIXES_IF=%s\n' "${{
|
||||
steps.ml.outputs.has_updated_sources == 1 &&
|
||||
(
|
||||
env.APPLY_FIXES_EVENT == 'all' ||
|
||||
env.APPLY_FIXES_EVENT == github.event_name
|
||||
) &&
|
||||
(
|
||||
github.event_name == 'push' ||
|
||||
github.event.pull_request.head.repo.full_name == github.repository
|
||||
)
|
||||
}}" >> "${GITHUB_ENV}"
|
||||
|
||||
# Set APPLY_FIXES_IF_* vars for use in future steps
|
||||
- name: Set APPLY_FIXES_IF_* vars
|
||||
shell: bash
|
||||
run: |
|
||||
printf 'APPLY_FIXES_IF_PR=%s\n' "${{
|
||||
env.APPLY_FIXES_IF == 'true' &&
|
||||
env.APPLY_FIXES_MODE == 'pull_request'
|
||||
}}" >> "${GITHUB_ENV}"
|
||||
printf 'APPLY_FIXES_IF_COMMIT=%s\n' "${{
|
||||
env.APPLY_FIXES_IF == 'true' &&
|
||||
env.APPLY_FIXES_MODE == 'commit' &&
|
||||
(!contains(fromJSON('["refs/heads/main", "refs/heads/master"]'), github.ref))
|
||||
}}" >> "${GITHUB_ENV}"
|
||||
|
||||
# Create pull request if applicable
|
||||
# (for now works only on PR from same repository, not from forks)
|
||||
- name: Create Pull Request with applied fixes
|
||||
uses: peter-evans/create-pull-request@v6
|
||||
id: cpr
|
||||
if: env.APPLY_FIXES_IF_PR == 'true'
|
||||
with:
|
||||
token: ${{ steps.git-config.outputs.token || secrets.PAT || secrets.GITHUB_TOKEN }}
|
||||
commit-message: '[MegaLinter] Apply linters automatic fixes'
|
||||
title: '[MegaLinter] Apply linters automatic fixes'
|
||||
labels: bot
|
||||
|
||||
- name: Create PR output
|
||||
if: env.APPLY_FIXES_IF_PR == 'true'
|
||||
shell: bash
|
||||
run: |
|
||||
echo "PR Number - ${{ steps.cpr.outputs.pull-request-number }}"
|
||||
echo "PR URL - ${{ steps.cpr.outputs.pull-request-url }}"
|
||||
|
||||
# Push new commit if applicable
|
||||
# (for now works only on PR from same repository, not from forks)
|
||||
- name: Prepare commit
|
||||
if: env.APPLY_FIXES_IF_COMMIT == 'true'
|
||||
shell: bash
|
||||
run: sudo chown -Rc $UID .git/
|
||||
|
||||
- name: Commit and push applied linter fixes
|
||||
uses: stefanzweifel/git-auto-commit-action@v4
|
||||
if: env.APPLY_FIXES_IF_COMMIT == 'true'
|
||||
with:
|
||||
branch: >-
|
||||
${{
|
||||
github.event.pull_request.head.ref ||
|
||||
github.head_ref ||
|
||||
github.ref
|
||||
}}
|
||||
commit_message: '[MegaLinter] Apply linters fixes'
|
||||
commit_user_name: ${{ steps.git-config.outputs.username }}
|
||||
commit_user_email: ${{ steps.git-config.outputs.email }}
|
||||
57
pre-commit/README.md
Normal file
57
pre-commit/README.md
Normal file
@@ -0,0 +1,57 @@
|
||||
# ivuorinen/actions/pre-commit
|
||||
|
||||
## pre-commit
|
||||
|
||||
### Description
|
||||
|
||||
Runs pre-commit on the repository and pushes the fixes back to the repository
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------- | ------------------------------------- | -------- | --------------------------- |
|
||||
| `pre-commit-config` | <p>pre-commit configuration file</p> | `false` | `.pre-commit-config.yaml` |
|
||||
| `base-branch` | <p>Base branch to compare against</p> | `false` | `""` |
|
||||
| `token` | <p>GitHub Token</p> | `false` | `${{ github.token }}` |
|
||||
| `commit_user` | <p>Commit user</p> | `false` | `GitHub Actions` |
|
||||
| `commit_email` | <p>Commit email</p> | `false` | `github-actions@github.com` |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/pre-commit@main
|
||||
with:
|
||||
pre-commit-config:
|
||||
# pre-commit configuration file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .pre-commit-config.yaml
|
||||
|
||||
base-branch:
|
||||
# Base branch to compare against
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
token:
|
||||
# GitHub Token
|
||||
#
|
||||
# Required: false
|
||||
# Default: ${{ github.token }}
|
||||
|
||||
commit_user:
|
||||
# Commit user
|
||||
#
|
||||
# Required: false
|
||||
# Default: GitHub Actions
|
||||
|
||||
commit_email:
|
||||
# Commit email
|
||||
#
|
||||
# Required: false
|
||||
# Default: github-actions@github.com
|
||||
```
|
||||
@@ -1,5 +1,8 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: pre-commit
|
||||
description: "Runs pre-commit on the repository and pushes the fixes back to the repository"
|
||||
description: 'Runs pre-commit on the repository and pushes the fixes back to the repository'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: check-square
|
||||
@@ -7,23 +10,24 @@ branding:
|
||||
|
||||
inputs:
|
||||
pre-commit-config:
|
||||
description: "pre-commit configuration file"
|
||||
required: true
|
||||
description: 'pre-commit configuration file'
|
||||
required: false
|
||||
default: '.pre-commit-config.yaml'
|
||||
base-branch:
|
||||
description: "Base branch to compare against"
|
||||
description: 'Base branch to compare against'
|
||||
required: false
|
||||
token:
|
||||
description: "GitHub Token"
|
||||
description: 'GitHub Token'
|
||||
required: false
|
||||
default: ${{ github.token }}
|
||||
commit_user:
|
||||
description: "Commit user"
|
||||
description: 'Commit user'
|
||||
required: false
|
||||
default: "GitHub Actions"
|
||||
default: 'GitHub Actions'
|
||||
commit_email:
|
||||
description: "Commit email"
|
||||
description: 'Commit email'
|
||||
required: false
|
||||
default: "github-actions@github.com"
|
||||
default: 'github-actions@github.com'
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
@@ -54,5 +58,5 @@ runs:
|
||||
if: always() # Push changes even when pre-commit fails
|
||||
uses: stefanzweifel/git-auto-commit-action@v5
|
||||
with:
|
||||
commit_message: "style(pre-commit): autofix"
|
||||
commit_message: 'style(pre-commit): autofix'
|
||||
add_options: -u
|
||||
|
||||
108
prettier-check/README.md
Normal file
108
prettier-check/README.md
Normal file
@@ -0,0 +1,108 @@
|
||||
# ivuorinen/actions/prettier-check
|
||||
|
||||
## Prettier Check
|
||||
|
||||
### Description
|
||||
|
||||
Run Prettier check on the repository with advanced configuration and reporting
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------- | ---------------------------------------------------------- | -------- | ------------------------------------------------ |
|
||||
| `working-directory` | <p>Directory containing files to check</p> | `false` | `.` |
|
||||
| `prettier-version` | <p>Prettier version to use</p> | `false` | `latest` |
|
||||
| `config-file` | <p>Path to Prettier config file</p> | `false` | `.prettierrc` |
|
||||
| `ignore-file` | <p>Path to Prettier ignore file</p> | `false` | `.prettierignore` |
|
||||
| `file-pattern` | <p>Files to include (glob pattern)</p> | `false` | `**/*.{js,jsx,ts,tsx,css,scss,json,md,yaml,yml}` |
|
||||
| `cache` | <p>Enable Prettier caching</p> | `false` | `true` |
|
||||
| `fail-on-error` | <p>Fail workflow if issues are found</p> | `false` | `true` |
|
||||
| `report-format` | <p>Output format (json, sarif)</p> | `false` | `sarif` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts</p> | `false` | `3` |
|
||||
| `plugins` | <p>Comma-separated list of Prettier plugins to install</p> | `false` | `""` |
|
||||
| `check-only` | <p>Only check for formatting issues without fixing</p> | `false` | `true` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------------- | --------------------------------------------- |
|
||||
| `files-checked` | <p>Number of files checked</p> |
|
||||
| `unformatted-files` | <p>Number of files with formatting issues</p> |
|
||||
| `sarif-file` | <p>Path to SARIF report file</p> |
|
||||
| `cache-hit` | <p>Indicates if there was a cache hit</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/prettier-check@main
|
||||
with:
|
||||
working-directory:
|
||||
# Directory containing files to check
|
||||
#
|
||||
# Required: false
|
||||
# Default: .
|
||||
|
||||
prettier-version:
|
||||
# Prettier version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: latest
|
||||
|
||||
config-file:
|
||||
# Path to Prettier config file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .prettierrc
|
||||
|
||||
ignore-file:
|
||||
# Path to Prettier ignore file
|
||||
#
|
||||
# Required: false
|
||||
# Default: .prettierignore
|
||||
|
||||
file-pattern:
|
||||
# Files to include (glob pattern)
|
||||
#
|
||||
# Required: false
|
||||
# Default: **/*.{js,jsx,ts,tsx,css,scss,json,md,yaml,yml}
|
||||
|
||||
cache:
|
||||
# Enable Prettier caching
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
fail-on-error:
|
||||
# Fail workflow if issues are found
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
|
||||
report-format:
|
||||
# Output format (json, sarif)
|
||||
#
|
||||
# Required: false
|
||||
# Default: sarif
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
|
||||
plugins:
|
||||
# Comma-separated list of Prettier plugins to install
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
|
||||
check-only:
|
||||
# Only check for formatting issues without fixing
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
```
|
||||
328
prettier-check/action.yml
Normal file
328
prettier-check/action.yml
Normal file
@@ -0,0 +1,328 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Prettier Check
|
||||
description: 'Run Prettier check on the repository with advanced configuration and reporting'
|
||||
author: Ismo Vuorinen
|
||||
|
||||
branding:
|
||||
icon: check-circle
|
||||
color: green
|
||||
|
||||
inputs:
|
||||
working-directory:
|
||||
description: 'Directory containing files to check'
|
||||
required: false
|
||||
default: '.'
|
||||
prettier-version:
|
||||
description: 'Prettier version to use'
|
||||
required: false
|
||||
default: 'latest'
|
||||
config-file:
|
||||
description: 'Path to Prettier config file'
|
||||
required: false
|
||||
default: '.prettierrc'
|
||||
ignore-file:
|
||||
description: 'Path to Prettier ignore file'
|
||||
required: false
|
||||
default: '.prettierignore'
|
||||
file-pattern:
|
||||
description: 'Files to include (glob pattern)'
|
||||
required: false
|
||||
default: '**/*.{js,jsx,ts,tsx,css,scss,json,md,yaml,yml}'
|
||||
cache:
|
||||
description: 'Enable Prettier caching'
|
||||
required: false
|
||||
default: 'true'
|
||||
fail-on-error:
|
||||
description: 'Fail workflow if issues are found'
|
||||
required: false
|
||||
default: 'true'
|
||||
report-format:
|
||||
description: 'Output format (json, sarif)'
|
||||
required: false
|
||||
default: 'sarif'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts'
|
||||
required: false
|
||||
default: '3'
|
||||
plugins:
|
||||
description: 'Comma-separated list of Prettier plugins to install'
|
||||
required: false
|
||||
default: ''
|
||||
check-only:
|
||||
description: 'Only check for formatting issues without fixing'
|
||||
required: false
|
||||
default: 'true'
|
||||
|
||||
outputs:
|
||||
files-checked:
|
||||
description: 'Number of files checked'
|
||||
value: ${{ steps.check.outputs.files_checked }}
|
||||
unformatted-files:
|
||||
description: 'Number of files with formatting issues'
|
||||
value: ${{ steps.check.outputs.unformatted_files }}
|
||||
sarif-file:
|
||||
description: 'Path to SARIF report file'
|
||||
value: ${{ steps.check.outputs.sarif_file }}
|
||||
cache-hit:
|
||||
description: 'Indicates if there was a cache hit'
|
||||
value: ${{ steps.cache.outputs.cache-hit }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Validate Inputs
|
||||
id: validate
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Validate working directory
|
||||
if [ ! -d "${{ inputs.working-directory }}" ]; then
|
||||
echo "::error::Working directory does not exist: ${{ inputs.working-directory }}"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate glob pattern
|
||||
if ! echo "${{ inputs.file-pattern }}" | grep -qE '^[*{}\[\].,a-zA-Z0-9/_-]+$'; then
|
||||
echo "::error::Invalid file pattern format"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
# Validate plugins format if provided
|
||||
if [ -n "${{ inputs.plugins }}" ]; then
|
||||
if ! echo "${{ inputs.plugins }}" | grep -qE '^[a-zA-Z0-9/@._,-]+$'; then
|
||||
echo "::error::Invalid plugins format"
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: ivuorinen/actions/node-setup@main
|
||||
|
||||
- name: Set up Cache
|
||||
id: cache
|
||||
uses: actions/cache@v4
|
||||
if: inputs.cache == 'true'
|
||||
with:
|
||||
path: |
|
||||
node_modules/.cache/prettier
|
||||
.prettier-cache
|
||||
key: ${{ runner.os }}-prettier-${{ hashFiles('**/package.json', '**/package-lock.json', '${{ inputs.config-file }}') }}
|
||||
restore-keys: |
|
||||
${{ runner.os }}-prettier-
|
||||
|
||||
- name: Install Dependencies
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Function to install with retries
|
||||
install_with_retries() {
|
||||
local attempt=1
|
||||
local max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Installation attempt $attempt of $max_attempts"
|
||||
|
||||
# Install Prettier and base dependencies
|
||||
if npm install \
|
||||
prettier@${{ inputs.prettier-version }} \
|
||||
@prettier/plugin-xml \
|
||||
prettier-plugin-packagejson \
|
||||
prettier-plugin-sh; then
|
||||
|
||||
# Install additional plugins if specified
|
||||
if [ -n "${{ inputs.plugins }}" ]; then
|
||||
IFS=',' read -ra PLUGINS <<< "${{ inputs.plugins }}"
|
||||
for plugin in "${PLUGINS[@]}"; do
|
||||
if ! npm install "$plugin"; then
|
||||
return 1
|
||||
fi
|
||||
done
|
||||
fi
|
||||
|
||||
return 0
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Installation failed, waiting 10 seconds before retry..."
|
||||
sleep 10
|
||||
fi
|
||||
done
|
||||
|
||||
echo "::error::Failed to install dependencies after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
install_with_retries
|
||||
|
||||
- name: Prepare Configuration
|
||||
id: config
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create default config if none exists
|
||||
if [ ! -f "${{ inputs.config-file }}" ]; then
|
||||
echo "Creating default Prettier configuration..."
|
||||
cat > "${{ inputs.config-file }}" <<EOF
|
||||
{
|
||||
"semi": true,
|
||||
"singleQuote": true,
|
||||
"tabWidth": 2,
|
||||
"useTabs": false,
|
||||
"trailingComma": "es5",
|
||||
"printWidth": 100,
|
||||
"arrowParens": "avoid",
|
||||
"endOfLine": "lf"
|
||||
}
|
||||
EOF
|
||||
fi
|
||||
|
||||
# Create default ignore file if none exists
|
||||
if [ ! -f "${{ inputs.ignore-file }}" ]; then
|
||||
echo "Creating default Prettier ignore file..."
|
||||
cat > "${{ inputs.ignore-file }}" <<EOF
|
||||
node_modules/
|
||||
dist/
|
||||
build/
|
||||
coverage/
|
||||
.git/
|
||||
*.min.js
|
||||
*.d.ts
|
||||
EOF
|
||||
fi
|
||||
|
||||
- name: Run Prettier Check
|
||||
id: check
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create reports directory
|
||||
mkdir -p reports
|
||||
|
||||
# Function to convert Prettier output to SARIF
|
||||
prettier_to_sarif() {
|
||||
local input_file=$1
|
||||
local output_file=$2
|
||||
|
||||
echo '{
|
||||
"$schema": "https://raw.githubusercontent.com/oasis-tcs/sarif-spec/master/Schemata/sarif-schema-2.1.0.json",
|
||||
"version": "2.1.0",
|
||||
"runs": [
|
||||
{
|
||||
"tool": {
|
||||
"driver": {
|
||||
"name": "Prettier",
|
||||
"informationUri": "https://prettier.io",
|
||||
"rules": []
|
||||
}
|
||||
},
|
||||
"results": [' > "$output_file"
|
||||
|
||||
local first=true
|
||||
while IFS= read -r line; do
|
||||
if [ "$first" = true ]; then
|
||||
first=false
|
||||
else
|
||||
echo "," >> "$output_file"
|
||||
fi
|
||||
|
||||
echo "{
|
||||
\"level\": \"error\",
|
||||
\"message\": {
|
||||
\"text\": \"File is not formatted according to Prettier rules\"
|
||||
},
|
||||
\"locations\": [
|
||||
{
|
||||
\"physicalLocation\": {
|
||||
\"artifactLocation\": {
|
||||
\"uri\": \"$line\"
|
||||
}
|
||||
}
|
||||
}
|
||||
]
|
||||
}" >> "$output_file"
|
||||
done < "$input_file"
|
||||
|
||||
echo ']}]}' >> "$output_file"
|
||||
}
|
||||
|
||||
# Run Prettier
|
||||
echo "Running Prettier check..."
|
||||
unformatted_files=$(mktemp)
|
||||
|
||||
if [ "${{ inputs.check-only }}" = "true" ]; then
|
||||
npx prettier \
|
||||
--check \
|
||||
--config "${{ inputs.config-file }}" \
|
||||
--ignore-path "${{ inputs.ignore-file }}" \
|
||||
${{ inputs.cache == 'true' && '--cache --cache-location=.prettier-cache' || '' }} \
|
||||
--no-error-on-unmatched-pattern \
|
||||
"${{ inputs.file-pattern }}" 2>&1 | \
|
||||
grep -oE '[^ ]+\.[a-zA-Z]+$' > "$unformatted_files" || true
|
||||
else
|
||||
npx prettier \
|
||||
--write \
|
||||
--list-different \
|
||||
--config "${{ inputs.config-file }}" \
|
||||
--ignore-path "${{ inputs.ignore-file }}" \
|
||||
${{ inputs.cache == 'true' && '--cache --cache-location=.prettier-cache' || '' }} \
|
||||
--no-error-on-unmatched-pattern \
|
||||
"${{ inputs.file-pattern }}" > "$unformatted_files" || true
|
||||
fi
|
||||
|
||||
# Count files
|
||||
files_checked=$(find . -type f -name "${{ inputs.file-pattern }}" -not -path "*/node_modules/*" | wc -l)
|
||||
unformatted_count=$(wc -l < "$unformatted_files")
|
||||
|
||||
echo "files_checked=${files_checked}" >> $GITHUB_OUTPUT
|
||||
echo "unformatted_files=${unformatted_count}" >> $GITHUB_OUTPUT
|
||||
|
||||
# Generate SARIF report if requested
|
||||
if [ "${{ inputs.report-format }}" = "sarif" ]; then
|
||||
prettier_to_sarif "$unformatted_files" "reports/prettier.sarif"
|
||||
echo "sarif_file=reports/prettier.sarif" >> $GITHUB_OUTPUT
|
||||
fi
|
||||
|
||||
# Clean up temporary file
|
||||
rm "$unformatted_files"
|
||||
|
||||
# Exit with error if issues found and fail-on-error is true
|
||||
if [ "${{ inputs.fail-on-error }}" = "true" ] && [ "$unformatted_count" -gt 0 ]; then
|
||||
echo "::error::Found $unformatted_count files with formatting issues"
|
||||
exit 1
|
||||
fi
|
||||
|
||||
- name: Upload Prettier Results
|
||||
if: always() && inputs.report-format == 'sarif'
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: ${{ inputs.working-directory }}/reports/prettier.sarif
|
||||
category: prettier
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Remove temporary files
|
||||
rm -rf reports/
|
||||
|
||||
# Clean cache if exists and not being preserved
|
||||
if [ "${{ inputs.cache }}" != "true" ]; then
|
||||
rm -rf .prettier-cache
|
||||
rm -rf node_modules/.cache/prettier
|
||||
fi
|
||||
17
prettier-fix/README.md
Normal file
17
prettier-fix/README.md
Normal file
@@ -0,0 +1,17 @@
|
||||
# ivuorinen/actions/prettier-fix
|
||||
|
||||
## Prettier Fix
|
||||
|
||||
### Description
|
||||
|
||||
Run Prettier to fix code style violations
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/prettier-fix@main
|
||||
```
|
||||
38
prettier-fix/action.yml
Normal file
38
prettier-fix/action.yml
Normal file
@@ -0,0 +1,38 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Prettier Fix
|
||||
description: Run Prettier to fix code style violations
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'code'
|
||||
color: 'blue'
|
||||
|
||||
runs:
|
||||
using: 'composite'
|
||||
steps:
|
||||
- name: Checkout Repository
|
||||
uses: actions/checkout@v4
|
||||
|
||||
- name: Set Git Config
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
- name: Node Setup
|
||||
uses: ivuorinen/actions/node-setup@main
|
||||
|
||||
- name: Install Dependencies
|
||||
shell: bash
|
||||
run: |
|
||||
npm install
|
||||
|
||||
- name: Run Prettier Fix
|
||||
shell: bash
|
||||
run: |
|
||||
npx prettier --write .
|
||||
|
||||
- name: Push Fixes
|
||||
if: always()
|
||||
uses: stefanzweifel/git-auto-commit-action@v5
|
||||
with:
|
||||
commit_message: 'style: autofix Prettier violations'
|
||||
add_options: '-u'
|
||||
72
python-lint-fix/README.md
Normal file
72
python-lint-fix/README.md
Normal file
@@ -0,0 +1,72 @@
|
||||
# ivuorinen/actions/python-lint-fix
|
||||
|
||||
## Python Lint and Fix
|
||||
|
||||
### Description
|
||||
|
||||
Lints and fixes Python files, commits changes, and uploads SARIF report.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| ------------------- | --------------------------------------------------------------------- | -------- | ------- |
|
||||
| `python-version` | <p>Python version to use</p> | `false` | `3.11` |
|
||||
| `flake8-version` | <p>Flake8 version to use</p> | `false` | `7.0.0` |
|
||||
| `autopep8-version` | <p>Autopep8 version to use</p> | `false` | `2.0.4` |
|
||||
| `max-retries` | <p>Maximum number of retry attempts for installations and linting</p> | `false` | `3` |
|
||||
| `working-directory` | <p>Directory containing Python files to lint</p> | `false` | `.` |
|
||||
| `fail-on-error` | <p>Whether to fail the action if linting errors are found</p> | `false` | `true` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| ------------- | ------------------------------------------------------ |
|
||||
| `lint-result` | <p>Result of the linting process (success/failure)</p> |
|
||||
| `fixed-files` | <p>Number of files that were fixed</p> |
|
||||
| `error-count` | <p>Number of errors found</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/python-lint-fix@main
|
||||
with:
|
||||
python-version:
|
||||
# Python version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3.11
|
||||
|
||||
flake8-version:
|
||||
# Flake8 version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: 7.0.0
|
||||
|
||||
autopep8-version:
|
||||
# Autopep8 version to use
|
||||
#
|
||||
# Required: false
|
||||
# Default: 2.0.4
|
||||
|
||||
max-retries:
|
||||
# Maximum number of retry attempts for installations and linting
|
||||
#
|
||||
# Required: false
|
||||
# Default: 3
|
||||
|
||||
working-directory:
|
||||
# Directory containing Python files to lint
|
||||
#
|
||||
# Required: false
|
||||
# Default: .
|
||||
|
||||
fail-on-error:
|
||||
# Whether to fail the action if linting errors are found
|
||||
#
|
||||
# Required: false
|
||||
# Default: true
|
||||
```
|
||||
231
python-lint-fix/action.yml
Normal file
231
python-lint-fix/action.yml
Normal file
@@ -0,0 +1,231 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Python Lint and Fix
|
||||
description: 'Lints and fixes Python files, commits changes, and uploads SARIF report.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
branding:
|
||||
icon: 'code'
|
||||
color: 'yellow'
|
||||
|
||||
inputs:
|
||||
python-version:
|
||||
description: 'Python version to use'
|
||||
required: false
|
||||
default: '3.11'
|
||||
flake8-version:
|
||||
description: 'Flake8 version to use'
|
||||
required: false
|
||||
default: '7.0.0'
|
||||
autopep8-version:
|
||||
description: 'Autopep8 version to use'
|
||||
required: false
|
||||
default: '2.0.4'
|
||||
max-retries:
|
||||
description: 'Maximum number of retry attempts for installations and linting'
|
||||
required: false
|
||||
default: '3'
|
||||
working-directory:
|
||||
description: 'Directory containing Python files to lint'
|
||||
required: false
|
||||
default: '.'
|
||||
fail-on-error:
|
||||
description: 'Whether to fail the action if linting errors are found'
|
||||
required: false
|
||||
default: 'true'
|
||||
|
||||
outputs:
|
||||
lint-result:
|
||||
description: 'Result of the linting process (success/failure)'
|
||||
value: ${{ steps.lint.outputs.result }}
|
||||
fixed-files:
|
||||
description: 'Number of files that were fixed'
|
||||
value: ${{ steps.fix.outputs.fixed_count }}
|
||||
error-count:
|
||||
description: 'Number of errors found'
|
||||
value: ${{ steps.lint.outputs.error_count }}
|
||||
|
||||
runs:
|
||||
using: composite
|
||||
steps:
|
||||
- name: Setup Python
|
||||
uses: actions/setup-python@v5
|
||||
with:
|
||||
python-version: ${{ inputs.python-version }}
|
||||
cache: 'pip'
|
||||
cache-dependency-path: |
|
||||
**/requirements.txt
|
||||
**/requirements-dev.txt
|
||||
**/pyproject.toml
|
||||
**/setup.py
|
||||
|
||||
- name: Check for Python Files
|
||||
id: check-files
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
if ! find . -name "*.py" -type f -not -path "*/\.*" | grep -q .; then
|
||||
echo "No Python files found. Skipping lint and fix."
|
||||
echo "result=skipped" >> $GITHUB_OUTPUT
|
||||
exit 0
|
||||
fi
|
||||
echo "result=found" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Install Dependencies
|
||||
if: steps.check-files.outputs.result == 'found'
|
||||
id: install
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
function install_with_retry() {
|
||||
local package=$1
|
||||
local version=$2
|
||||
local attempt=1
|
||||
local max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Installing $package==$version (Attempt $attempt of $max_attempts)"
|
||||
if pip install "$package==$version"; then
|
||||
return 0
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Installation failed, waiting 5 seconds before retry..."
|
||||
sleep 5
|
||||
fi
|
||||
done
|
||||
|
||||
echo "::error::Failed to install $package after $max_attempts attempts"
|
||||
return 1
|
||||
}
|
||||
|
||||
# Create virtual environment
|
||||
python -m venv .venv
|
||||
source .venv/bin/activate
|
||||
|
||||
# Install dependencies with retry logic
|
||||
install_with_retry flake8 ${{ inputs.flake8-version }}
|
||||
install_with_retry autopep8 ${{ inputs.autopep8-version }}
|
||||
|
||||
# Verify installations
|
||||
flake8 --version || exit 1
|
||||
autopep8 --version || exit 1
|
||||
|
||||
- name: Run flake8
|
||||
if: steps.check-files.outputs.result == 'found'
|
||||
id: lint
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
source .venv/bin/activate
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create temporary directory for reports
|
||||
mkdir -p reports
|
||||
|
||||
# Run flake8 with error handling
|
||||
error_count=0
|
||||
if ! flake8 --format=sarif --output-file=reports/flake8.sarif .; then
|
||||
error_count=$(grep -c "level\": \"error\"" reports/flake8.sarif || echo 0)
|
||||
echo "Found $error_count linting errors"
|
||||
echo "error_count=$error_count" >> $GITHUB_OUTPUT
|
||||
|
||||
if [[ "${{ inputs.fail-on-error }}" == "true" ]]; then
|
||||
echo "::error::Linting failed with $error_count errors"
|
||||
echo "result=failure" >> $GITHUB_OUTPUT
|
||||
exit 1
|
||||
fi
|
||||
fi
|
||||
|
||||
echo "result=success" >> $GITHUB_OUTPUT
|
||||
echo "error_count=$error_count" >> $GITHUB_OUTPUT
|
||||
|
||||
- name: Run autopep8 Fix
|
||||
if: steps.check-files.outputs.result == 'found'
|
||||
id: fix
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
source .venv/bin/activate
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Create temporary file for tracking changes
|
||||
touch /tmp/changed_files
|
||||
|
||||
# Run autopep8 with change detection
|
||||
find . -name "*.py" -type f -not -path "*/\.*" | while read -r file; do
|
||||
if autopep8 --diff "$file" | grep -q '^[+-]'; then
|
||||
autopep8 --in-place "$file"
|
||||
echo "$file" >> /tmp/changed_files
|
||||
fi
|
||||
done
|
||||
|
||||
# Count fixed files
|
||||
fixed_count=$(wc -l < /tmp/changed_files || echo 0)
|
||||
echo "Fixed $fixed_count files"
|
||||
echo "fixed_count=$fixed_count" >> $GITHUB_OUTPUT
|
||||
|
||||
# Cleanup
|
||||
rm /tmp/changed_files
|
||||
|
||||
- name: Set Git Config for Fixes
|
||||
if: steps.fix.outputs.fixed_count > 0
|
||||
uses: ivuorinen/actions/set-git-config@main
|
||||
|
||||
- name: Commit Fixes
|
||||
if: steps.fix.outputs.fixed_count > 0
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
cd ${{ inputs.working-directory }}
|
||||
|
||||
# Commit changes with retry logic
|
||||
attempt=1
|
||||
max_attempts=${{ inputs.max-retries }}
|
||||
|
||||
while [ $attempt -le $max_attempts ]; do
|
||||
echo "Attempting to commit and push changes (Attempt $attempt of $max_attempts)"
|
||||
|
||||
git add .
|
||||
git commit -m "fix: applied python lint fixes to ${{ steps.fix.outputs.fixed_count }} files"
|
||||
|
||||
if git pull --rebase && git push; then
|
||||
echo "Successfully pushed changes"
|
||||
break
|
||||
fi
|
||||
|
||||
attempt=$((attempt + 1))
|
||||
if [ $attempt -le $max_attempts ]; then
|
||||
echo "Push failed, waiting 5 seconds before retry..."
|
||||
sleep 5
|
||||
else
|
||||
echo "::error::Failed to push changes after $max_attempts attempts"
|
||||
exit 1
|
||||
fi
|
||||
done
|
||||
|
||||
- name: Upload SARIF Report
|
||||
if: steps.check-files.outputs.result == 'found'
|
||||
uses: github/codeql-action/upload-sarif@v2
|
||||
with:
|
||||
sarif_file: ${{ inputs.working-directory }}/reports/flake8.sarif
|
||||
category: 'python-lint'
|
||||
|
||||
- name: Cleanup
|
||||
if: always()
|
||||
shell: bash
|
||||
run: |
|
||||
set -euo pipefail
|
||||
|
||||
# Remove virtual environment
|
||||
rm -rf .venv
|
||||
|
||||
# Remove temporary files
|
||||
rm -rf reports
|
||||
51
release-monthly/README.md
Normal file
51
release-monthly/README.md
Normal file
@@ -0,0 +1,51 @@
|
||||
# ivuorinen/actions/release-monthly
|
||||
|
||||
## Do Monthly Release
|
||||
|
||||
### Description
|
||||
|
||||
Creates a release for the current month, incrementing patch number if necessary.
|
||||
|
||||
### Inputs
|
||||
|
||||
| name | description | required | default |
|
||||
| --------- | -------------------------------------------------------- | -------- | ------- |
|
||||
| `token` | <p>GitHub token with permission to create releases.</p> | `true` | `""` |
|
||||
| `dry-run` | <p>Run in dry-run mode without creating the release.</p> | `false` | `false` |
|
||||
| `prefix` | <p>Optional prefix for release tags.</p> | `false` | `""` |
|
||||
|
||||
### Outputs
|
||||
|
||||
| name | description |
|
||||
| -------------- | ------------------------------------- |
|
||||
| `release-tag` | <p>The tag of the created release</p> |
|
||||
| `release-url` | <p>The URL of the created release</p> |
|
||||
| `previous-tag` | <p>The previous release tag</p> |
|
||||
|
||||
### Runs
|
||||
|
||||
This action is a `composite` action.
|
||||
|
||||
### Usage
|
||||
|
||||
```yaml
|
||||
- uses: ivuorinen/actions/release-monthly@main
|
||||
with:
|
||||
token:
|
||||
# GitHub token with permission to create releases.
|
||||
#
|
||||
# Required: true
|
||||
# Default: ""
|
||||
|
||||
dry-run:
|
||||
# Run in dry-run mode without creating the release.
|
||||
#
|
||||
# Required: false
|
||||
# Default: false
|
||||
|
||||
prefix:
|
||||
# Optional prefix for release tags.
|
||||
#
|
||||
# Required: false
|
||||
# Default: ""
|
||||
```
|
||||
@@ -1,46 +1,168 @@
|
||||
---
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
|
||||
name: 'Release'
|
||||
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
|
||||
name: Do Monthly Release
|
||||
description: 'Creates a release for the current month, incrementing patch number if necessary.'
|
||||
author: 'Ismo Vuorinen'
|
||||
|
||||
on:
|
||||
workflow_dispatch:
|
||||
workflow_call:
|
||||
schedule:
|
||||
- cron: '0 0 1 * *' # 1st of every month at midnight
|
||||
branding:
|
||||
icon: calendar
|
||||
color: blue
|
||||
|
||||
jobs:
|
||||
release:
|
||||
name: Release
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
contents: write
|
||||
inputs:
|
||||
token:
|
||||
description: 'GitHub token with permission to create releases.'
|
||||
required: true
|
||||
dry-run:
|
||||
description: 'Run in dry-run mode without creating the release.'
|
||||
required: false
|
||||
default: 'false'
|
||||
prefix:
|
||||
description: 'Optional prefix for release tags.'
|
||||
required: false
|
||||
default: ''
|
||||
|
||||
outputs:
|
||||
release-tag:
|
||||
description: 'The tag of the created release'
|
||||
    value: ${{ steps.create-release.outputs.release_tag }}
  release-url:
    description: 'The URL of the created release'
    value: ${{ steps.create-release.outputs.release_url }}
  previous-tag:
    description: 'The previous release tag'
    value: ${{ steps.create-release.outputs.previous_tag }}

runs:
  using: 'composite'
  steps:
    - name: Checkout
    - name: Validate Inputs
      shell: bash
      run: |
        set -euo pipefail

        # Validate token
        if [ -z "${{ inputs.token }}" ]; then
          echo "::error::GitHub token is required"
          exit 1
        fi

        # Validate dry-run option
        if [ "${{ inputs.dry-run }}" != "true" ] && [ "${{ inputs.dry-run }}" != "false" ]; then
          echo "::error::dry-run must be either 'true' or 'false'"
          exit 1
        fi

        # Validate prefix format if provided
        if [ -n "${{ inputs.prefix }}" ]; then
          if ! [[ "${{ inputs.prefix }}" =~ ^[a-zA-Z0-9_.-]*$ ]]; then
            echo "::error::Invalid prefix format. Only alphanumeric characters, dots, underscores, and hyphens are allowed"
            exit 1
          fi
        fi

    - name: Checkout Repository
      uses: actions/checkout@v4
      with:
        fetch-depth: 0 # Fetch all history for tag comparison

    - name: Create Release
      id: create-release
      shell: bash
      env:
        GITHUB_TOKEN: '${{ secrets.GITHUB_TOKEN }}'
        GITHUB_TOKEN: '${{ inputs.token }}'
      run: |
        # Retrieve previous release tag
        previous_tag="$(gh release list --limit 1 | awk '{ print $1 }')"
        set -euo pipefail

        # Function to validate version format
        validate_version() {
          local version=$1
          if ! [[ $version =~ ^[0-9]{4}\.[0-9]{1,2}\.[0-9]+$ ]]; then
            echo "::error::Invalid version format: $version"
            return 1
          fi
        }

        # Function to get previous release tag with error handling
        get_previous_tag() {
          local previous_tag
          previous_tag=$(gh release list --limit 1 2>/dev/null | awk '{ print $1 }') || {
            echo "::warning::No previous releases found"
            return 1
          }
          echo "$previous_tag"
        }

        # Get previous release tag
        previous_tag=$(get_previous_tag) || previous_tag=""
        echo "previous_tag=${previous_tag}" >> $GITHUB_OUTPUT

        if [ -n "$previous_tag" ]; then
          previous_major="${previous_tag%%\.*}"
          previous_minor="${previous_tag#*.}"
          previous_minor="${previous_minor%.*}"
          previous_patch="${previous_tag##*.}"

          # Validate previous tag format
          validate_version "$previous_tag" || {
            echo "::error::Invalid previous tag format: $previous_tag"
            exit 1
          }
        fi

        # Determine next release tag
        next_major_minor="$(date +'%Y').$(date +'%-m')"
        if [[ "${previous_major}.${previous_minor}" == "${next_major_minor}" ]]; then

        if [ -n "$previous_tag" ] && [[ "${previous_major}.${previous_minor}" == "${next_major_minor}" ]]; then
          echo "Month release already exists for year, incrementing patch number by 1"
          next_patch="$((previous_patch + 1))"
        else
          echo "Month release does not exist for year, setting patch number to 0"
          next_patch="0"
        fi
        # Create release

        # Construct release tag
        release_tag="${next_major_minor}.${next_patch}"
        gh release create "${release_tag}" \
        if [ -n "${{ inputs.prefix }}" ]; then
          release_tag="${{ inputs.prefix }}${release_tag}"
        fi

        # Validate final release tag
        validate_version "${release_tag#${{ inputs.prefix }}}" || {
          echo "::error::Invalid release tag format: $release_tag"
          exit 1
        }

        echo "release_tag=${release_tag}" >> $GITHUB_OUTPUT

        # Create release if not in dry-run mode
        if [ "${{ inputs.dry-run }}" = "false" ]; then
          echo "Creating release ${release_tag}..."
          release_url=$(gh release create "${release_tag}" \
            --repo="${GITHUB_REPOSITORY}" \
            --title="${release_tag}" \
            --generate-notes
            --generate-notes 2>/dev/null) || {
            echo "::error::Failed to create release"
            exit 1
          }
          echo "release_url=${release_url}" >> $GITHUB_OUTPUT
          echo "Release created successfully: ${release_url}"
        else
          echo "Dry run mode - would create release: ${release_tag}"
          echo "release_url=dry-run" >> $GITHUB_OUTPUT
        fi

    - name: Verify Release
      if: inputs.dry-run == 'false'
      shell: bash
      env:
        GITHUB_TOKEN: '${{ inputs.token }}'
      run: |
        set -euo pipefail

        # Verify the release was created
        if ! gh release view "${{ steps.create-release.outputs.release_tag }}" &>/dev/null; then
          echo "::error::Failed to verify release creation"
          exit 1
        fi

        echo "Release verification successful"
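A minimal caller sketch for the action above, assuming it is published as `ivuorinen/actions/create-release` and referenced at `@main`; the workflow name, trigger, and permissions block are illustrative, while the inputs (`token`, `dry-run`, `prefix`) and the `release-url` / `previous-tag` outputs come from the definition in this diff:

```yaml
# Hypothetical caller workflow (sketch only) for the CalVer release action.
name: Release
on:
  workflow_dispatch: # manual trigger; an assumption, not part of the action

permissions:
  contents: write # gh release create needs write access to repository contents

jobs:
  release:
    runs-on: ubuntu-latest
    steps:
      - name: Create release
        id: release
        uses: ivuorinen/actions/create-release@main # ref is an assumption
        with:
          token: ${{ secrets.GITHUB_TOKEN }}
          dry-run: 'false'
          prefix: '' # optional tag prefix, e.g. 'v'

      - name: Show release info
        shell: bash
        run: |
          echo "Release URL: ${{ steps.release.outputs.release-url }}"
          echo "Previous tag: ${{ steps.release.outputs.previous-tag }}"
```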
64
run.sh
Executable file
@@ -0,0 +1,64 @@
#!/usr/bin/env bash

# Check if the OS is macOS or Linux
if [[ $OSTYPE == "darwin"* ]]; then
  # macOS needs -i .bak because it doesn't support -i without arguments
  SED_CMD="sed -i .bak"
else
  # Linux supports -i without arguments
  SED_CMD="sed -i"
fi

find . -mindepth 1 -maxdepth 1 -type d | while read -r dir; do
  dir=${dir#./}
  action="./$dir/action.yml"

  # if action doesn't exist, skip
  if [ ! -f "$action" ]; then
    continue
  fi

  repo="ivuorinen/actions/$dir"
  readme="./$dir/README.md"
  # '# version: x.y.z' -> third space-separated field is the version number
  version=$(grep -E '^# version:' "$action" | cut -d ' ' -f 3)

  # if version doesn't exist, use 'main'
  if [ -z "$version" ]; then
    version="main"
  fi

  echo "Updating $readme..."

  printf "# %s\n\n" "$repo" >"$readme"

  echo "- Generating action documentation..."
  npx --yes action-docs@latest \
    --source="$action" \
    --no-banner \
    --include-name-header >>"$readme"

  echo "- Replacing placeholders in $readme..."
  $SED_CMD "s|PROJECT|$repo|g; s|VERSION|$version|g; s|\*\*\*||g" "$readme"

  if [ -f "$readme.bak" ]; then
    rm "$readme.bak"
    echo "- Removed $readme.bak"
  fi
done
echo ""

echo "Running markdownlint..."
npx --yes markdownlint-cli --fix \
  --ignore "**/node_modules/**" "**/README.md"
echo ""

echo "Running prettier..."
npx --yes prettier --write \
  "run.sh" "**/README.md" "**/action.yml" ".github/workflows/*.yml"
echo ""

echo "Running MegaLinter..."
npx --yes mega-linter-runner
echo ""

echo "Done!"
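For reference, the `# version:` comment that run.sh greps out of each action.yml would look roughly like the sketch below; the action name and version value are made up, and the exact placement only has to satisfy the `^# version:` pattern:

```yaml
# version: 2025.1.0
# ^ hypothetical value; run.sh reads it with grep '^# version:' and falls
#   back to 'main' when the comment is missing.
---
name: Example Action
description: 'Illustrative header only, not an action shipped in this repository.'
```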
51
set-git-config/README.md
Normal file
@@ -0,0 +1,51 @@
# ivuorinen/actions/set-git-config

## Set Git Config

### Description

Sets Git configuration for actions.

### Inputs

| name       | description                         | required | default                       |
| ---------- | ----------------------------------- | -------- | ----------------------------- |
| `token`    | <p>GitHub token.</p>                | `false`  | `${{ secrets.GITHUB_TOKEN }}` |
| `username` | <p>GitHub username for commits.</p> | `false`  | `github-actions`              |
| `email`    | <p>GitHub email for commits.</p>    | `false`  | `github-actions@github.com`   |

### Outputs

| name       | description                         |
| ---------- | ----------------------------------- |
| `token`    | <p>GitHub token.</p>                |
| `username` | <p>GitHub username for commits.</p> |
| `email`    | <p>GitHub email for commits.</p>    |

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/set-git-config@main
  with:
    token:
    # GitHub token.
    #
    # Required: false
    # Default: ${{ secrets.GITHUB_TOKEN }}

    username:
    # GitHub username for commits.
    #
    # Required: false
    # Default: github-actions

    email:
    # GitHub email for commits.
    #
    # Required: false
    # Default: github-actions@github.com
```
@@ -1,31 +1,75 @@
name: set-git-config
description: "Sets git config for the action"
---
# yaml-language-server: $schema=https://json.schemastore.org/github-action.json
name: Set Git Config
description: 'Sets Git configuration for actions.'
author: 'Ismo Vuorinen'

branding:
  icon: settings
  icon: git-commit
  color: gray-dark

inputs:
  token:
    description: "GitHub token"
    required: true
    description: 'GitHub token.'
    required: false
    default: '${{ secrets.GITHUB_TOKEN }}'
  username:
    description: "GitHub username action should use"
    default: "github-actions"
    format: string
    description: 'GitHub username for commits.'
    default: 'github-actions'
  email:
    description: "GitHub email action should use"
    default: "github-actions@github.com"
    format: email
    description: 'GitHub email for commits.'
    default: 'github-actions@github.com'

outputs:
  token:
    description: 'GitHub token.'
    value: ${{ steps.bot.outputs.token }}
  username:
    description: 'GitHub username for commits.'
    value: ${{ steps.bot.outputs.username }}
  email:
    description: 'GitHub email for commits.'
    value: ${{ steps.bot.outputs.email }}

runs:
  using: composite
  steps:
    - name: Set git config
    - name: Check for FIXIMUS_TOKEN
      id: bot
      run: |
        git config --local --unset-all http.https://github.com/.extraheader || true
        git config --global --add url.https://x-access-token:${{ inputs.token }}@github.com/.insteadOf 'https://github.com/'
        git config --global --add url.https://x-access-token:${{ inputs.token }}@github.com/.insteadOf 'git@github.com:'
        git config --global user.name ${{ inputs.username }}
        git config --global user.email ${{ inputs.email }}
        if [ -n "${{ secrets.FIXIMUS_TOKEN }}" ]; then
          echo "token=${{ secrets.FIXIMUS_TOKEN }}" >> $GITHUB_OUTPUT
          echo "username=fiximus" >> $GITHUB_OUTPUT
          echo "email=github-bot@ivuorinen.net" >> $GITHUB_OUTPUT
        else
          echo "token=${{ inputs.token }}" >> $GITHUB_OUTPUT
          echo "username=${{ inputs.username }}" >> $GITHUB_OUTPUT
          echo "email=${{ inputs.email }}" >> $GITHUB_OUTPUT
        fi
      shell: bash

    - name: Configure Git
      run: |
        # Function to clean up Git config
        cleanup_git_config() {
          git config --local --unset-all "url.https://x-access-token:${TOKEN}@github.com/.insteadof" || true
          git config --local --unset-all "user.name" || true
          git config --local --unset-all "user.email" || true
        }

        # Set up trap to ensure cleanup on exit
        trap cleanup_git_config EXIT

        # Store token in variable to avoid repeated exposure
        TOKEN="${{ steps.bot.outputs.token }}"

        git config --local --unset-all http.https://github.com/.extraheader || true
        git config --local \
          --add url.https://x-access-token:${{ steps.bot.outputs.token }}@github.com/.insteadOf \
          "https://github.com/"
        git config --local \
          --add url.https://x-access-token:${{ steps.bot.outputs.token }}@github.com/.insteadOf \
          'git@github.com:'
        git config --local user.name "${{ steps.bot.outputs.username }}"
        git config --local user.email "${{ steps.bot.outputs.email }}"
      shell: bash
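A short sketch of how a later workflow step might consume the outputs resolved above; the `@main` ref, step ids, and the commit command are assumptions, while the `token`, `username`, and `email` output names match the definition:

```yaml
# Hypothetical downstream usage (sketch only).
- name: Set Git Config
  id: git-config
  uses: ivuorinen/actions/set-git-config@main # ref is an assumption

- name: Commit as the resolved identity
  shell: bash
  env:
    GH_TOKEN: ${{ steps.git-config.outputs.token }}
  run: |
    echo "Committing as ${{ steps.git-config.outputs.username }} <${{ steps.git-config.outputs.email }}>"
    git commit --allow-empty -m "chore: automated update"
```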
8
sonar-project.properties
Normal file
@@ -0,0 +1,8 @@
sonar.projectKey=ivuorinen_actions
sonar.organization=ivuorinen

sonar.sources=.
sonar.exclusions=**/node_modules/**,**/dist/**,**/coverage/**,**/.github/**
sonar.test.inclusions=**/*.test.js,**/*.test.ts
sonar.javascript.lcov.reportPaths=coverage/lcov.info
sonar.sourceEncoding=UTF-8
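This properties file is normally picked up from the repository root by a SonarCloud analysis job; a hedged sketch of such a job follows, where the scan action, its ref, and the `SONAR_TOKEN` secret name are assumptions rather than something defined in this diff:

```yaml
# Illustrative SonarCloud job (sketch only); reads sonar-project.properties.
sonarcloud:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
      with:
        fetch-depth: 0 # full history gives SonarCloud better new-code data
    - uses: SonarSource/sonarcloud-github-action@master # assumed action and ref
      env:
        GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
        SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} # assumed secret name
```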
17
stale/README.md
Normal file
@@ -0,0 +1,17 @@
# ivuorinen/actions/stale

## Stale

### Description

A GitHub Action to close stale issues and pull requests.

### Runs

This action is a `composite` action.

### Usage

```yaml
- uses: ivuorinen/actions/stale@main
```
38
stale/action.yml
Normal file
@@ -0,0 +1,38 @@
---
name: Stale
description: 'A GitHub Action to close stale issues and pull requests.'
author: 'Ismo Vuorinen'

branding:
  icon: clock
  color: yellow

runs:
  using: composite
  steps:
    - name: 🚀 Run stale
      uses: actions/stale@v9.1.0
      with:
        repo-token: ${{ secrets.GITHUB_TOKEN }}
        days-before-stale: 30
        days-before-close: 7
        remove-stale-when-updated: true
        stale-issue-label: 'stale'
        exempt-issue-labels: 'no-stale,help-wanted'
        stale-issue-message: >
          There hasn't been any activity on this issue recently, so we
          clean up some of the older and inactive issues.

          Please make sure to update to the latest version and
          check if that solves the issue. Let us know if that works for you
          by leaving a comment 👍

          This issue has now been marked as stale and will be closed if no
          further activity occurs. Thanks!
        stale-pr-label: 'stale'
        exempt-pr-labels: 'no-stale'
        stale-pr-message: >
          There hasn't been any activity on this pull request recently. This
          pull request has been automatically marked as stale because of that
          and will be closed if no further activity occurs within 7 days.
          Thank you for your contributions.
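The defaults above only take effect when the composite is actually invoked, typically from a scheduled workflow; a minimal caller sketch, where the cron expression, permissions block, and `@main` ref are assumptions:

```yaml
# Hypothetical scheduled caller for the stale composite action (sketch only).
name: Stale
on:
  schedule:
    - cron: '30 1 * * *' # daily; illustrative value

permissions:
  issues: write
  pull-requests: write

jobs:
  stale:
    runs-on: ubuntu-latest
    steps:
      - uses: ivuorinen/actions/stale@main # ref is an assumption
```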
22
supressions.xml
Normal file
@@ -0,0 +1,22 @@
<?xml version="1.0" encoding="UTF-8"?>
<suppressions xmlns="https://jeremylong.github.io/DependencyCheck/dependency-suppression.1.3.xsd">
  <!-- Example suppression for a specific CVE -->
  <suppress>
    <notes>Description of why this vulnerability is suppressed</notes>
    <cve>CVE-2023-12345</cve>
  </suppress>

  <!-- Example suppression for a specific package -->
  <suppress>
    <notes>Package is only used in development</notes>
    <packageUrl regex="true">^pkg:npm/dev\-dependency@.*$</packageUrl>
    <vulnerabilityName regex="true">.*</vulnerabilityName>
  </suppress>

  <!-- Example suppression based on CVSS score -->
  <suppress>
    <notes>Low severity issues in test dependencies</notes>
    <cvssBelow>4.0</cvssBelow>
    <packageUrl regex="true">^pkg:npm/test\-.*$</packageUrl>
  </suppress>
</suppressions>
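The schema referenced in the `xmlns` above belongs to OWASP Dependency-Check, so a scan would pass this file via its `--suppression` flag; a hedged sketch of such a step (assuming the Dependency-Check CLI is already available on the runner, and using the file name exactly as it is added in this diff):

```yaml
# Illustrative scan step (sketch only); the CLI install is assumed.
- name: Dependency-Check scan
  shell: bash
  run: |
    dependency-check.sh \
      --project "ivuorinen/actions" \
      --scan . \
      --suppression supressions.xml \
      --format HTML
```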