# Workflow name: Migration Review
name: Migration Review

# Trigger: runs when a PR is opened that touches migration-related paths
on:
  pull_request_target:
    types: [opened] # only fire when the PR is opened
    paths:
      - 'ghost/core/core/server/data/schema/**' # watch all changes under the schema directory
      - 'ghost/core/core/server/data/migrations/versions/**' # watch all changes under the migration versions directory

jobs:
  createComment: # job that labels the PR and posts the review checklist
    runs-on: ubuntu-latest # run on the latest Ubuntu runner
    if: github.repository_owner == 'TryGhost' # only run when the repository owner is TryGhost
    name: Add migration review requirements
    steps:
      # Step 1: add the "migration" label to the PR
      - uses: actions/github-script@v7 # official GitHub script action
        with:
          script: |
            // Call the GitHub API to add a label to the current PR
            github.rest.issues.addLabels({
              issue_number: context.issue.number, // current PR number
              owner: context.repo.owner,          // repository owner (from the context)
              repo: context.repo.repo,            // repository name (from the context)
              labels: ["migration"]               // label(s) to add
            })
      # Step 2: post the migration review checklist as a PR comment
      - uses: peter-evans/create-or-update-comment@ac8e6509d7545ebc2e5e7c35eaa12195c2f77adc
        with:
          issue-number: ${{ github.event.pull_request.number }} # PR to comment on
          body: |
            It looks like this PR contains a migration 👀

            Here's the checklist for reviewing migrations:

            ### General requirements

            - [ ] :warning: Tested performance on staging database servers, as performance on local machines is not comparable to a production environment
            - [ ] Satisfies idempotency requirement (both `up()` and `down()`)
            - [ ] Does not reference models
            - [ ] Filename is in the correct format (and correctly ordered)
            - [ ] Targets the next minor version
            - [ ] All code paths have appropriate log messages
            - [ ] Uses the correct utils
            - [ ] Contains a minimal changeset
            - [ ] Does not mix DDL/DML operations
            - [ ] Tested in MySQL and SQLite

            ### Schema changes

            - [ ] Both schema change and related migration have been implemented
            - [ ] For index changes: has been performance tested for large tables
            - [ ] For new tables/columns: fields use the appropriate predefined field lengths
            - [ ] For new tables/columns: field names follow the appropriate conventions
            - [ ] Does not drop a non-alpha table outside of a major version

            ### Data changes

            - [ ] Mass updates/inserts are batched appropriately
            - [ ] Does not loop over large tables/datasets
            - [ ] Defends against missing or invalid data
            - [ ] For settings updates: follows the appropriate guidelines