Compare commits: `386e039ff6...feature/is` (13 commits)

Commits:
- d912471d0e
- 5f337e2ebc
- 4a404875a9
- cdd3814781
- dbf57b5068
- 7efc254ab5
- 2742628b78
- d60fd8947b
- 694d73b212
- b2b02b6f57
- 2dbe98615d
- 34cf081c96
- 7bc4e88335
.gitea/ISSUE_TEMPLATE/runtime_verification.md (new file, 41 lines)
@@ -0,0 +1,41 @@
---
name: Runtime Verification Incident
about: Register anomalies found while verifying production/staging behavior
title: "[RUNTIME-VERIFY][SCN-XXX] "
labels: runtime, verification
---

## Summary

- Symptom:
- First observed (UTC):

## Reproduction / Observation

- Run mode (`live`/`paper`):
- Session (`NXT`, `US_PRE`, `US_DAY`, `US_AFTER`, ...):
- Run command:
- Log path:

## Expected vs Actual

- Expected:
- Actual:

## Requirement Mapping

- REQ:
- TASK:
- TEST:

## Temporary Mitigation

- Immediate mitigation:

## Close Criteria

- [ ] Dev fix applied
- [ ] Verifier re-verification PASS
- [ ] Runtime Verifier re-observation PASS
- [ ] `NOT_OBSERVED = 0`
.gitea/PULL_REQUEST_TEMPLATE.md (new file, 47 lines)
@@ -0,0 +1,47 @@
## Linked Issue

- Closes #N

## Scope

- REQ: `REQ-...`
- TASK: `TASK-...`
- TEST: `TEST-...`

## Ticket Stage

- Current stage: `Implemented` / `Integrated` / `Observed` / `Accepted`
- Previous stage evidence link:

## Main -> Verifier Directive Contract

- Scope: target requirements/code/log paths
- Method: run commands + observation points
- PASS criteria:
- FAIL criteria:
- NOT_OBSERVED criteria:
- Evidence format: PR comment `Coverage Matrix`

## Verifier Coverage Matrix (Required)

| Item | Evidence | Status (PASS/FAIL/NOT_OBSERVED) |
|---|---|---|
| REQ-... | link/log | PASS |

If even one `NOT_OBSERVED` remains, approval/merge is prohibited.

## Gitea Preflight

- [ ] Troubleshooting in `docs/commands.md` and `docs/workflow.md` checked first
- [ ] Used `tea` (no `gh`)

## Runtime Evidence

- Actual system run command:
- Monitoring log path:
- Anomaly/issue links:

## Approval Gate

- [ ] Static Verifier approval comment linked
- [ ] Runtime Verifier approval comment linked
.github/workflows/ci.yml (vendored, 6 changed lines)
@@ -21,6 +21,12 @@ jobs:
      - name: Install dependencies
        run: pip install ".[dev]"

      - name: Validate governance assets
        run: python3 scripts/validate_governance_assets.py

      - name: Validate Ouroboros docs
        run: python3 scripts/validate_ouroboros_docs.py

      - name: Lint
        run: ruff check src/ tests/
@@ -12,6 +12,8 @@ It is distinct from `docs/requirements-log.md`, which records **project/product
1. **Workflow enforcement**
   - Follow `docs/workflow.md` for all changes.
   - Before any Gitea issue/PR/comment operation, read `docs/commands.md` and the `docs/workflow.md` troubleshooting section.
   - Use `tea` for Gitea operations; do not use GitHub CLI (`gh`) in this repository workflow.
   - Create a Gitea issue before any code or documentation change.
   - Work on a feature branch `feature/issue-{N}-{short-description}` and open a PR.
   - Never commit directly to `main`.
@@ -43,3 +45,8 @@ It is distinct from `docs/requirements-log.md`, which records **project/product
- When work requires guidance, consult the relevant `docs/` policies first.
- Any code change must be accompanied by relevant documentation updates.
- Persist user constraints across sessions by recording them in this document.

### 2026-02-27

- All agents must pre-read `docs/commands.md` and `docs/workflow.md` troubleshooting before running Gitea issue/PR/comment commands.
- `gh` CLI is prohibited for repository ticket/PR operations; use `tea` (or the documented Gitea API fallback only).
@@ -4,6 +4,13 @@
**Critical: Learn from failures. Never repeat the same failed command without modification.**

## Repository VCS Rule (Mandatory)

- Ticket/PR/comment work in this repository is performed against Gitea.
- Use of `gh` (GitHub CLI) commands is prohibited.
- The default tool is `tea`; use the Gitea API as a fallback only for cases `tea` does not support.
- Before running commands, always check `Gitea CLI Formatting Troubleshooting` in `docs/workflow.md`.

### tea CLI (Gitea Command Line Tool)

#### ❌ TTY Error - Interactive Confirmation Fails
@@ -140,6 +147,12 @@ python -m src.main --mode=paper
# Run with dashboard enabled
python -m src.main --mode=paper --dashboard

# Runtime verification monitor (NOT_OBSERVED detection)
bash scripts/runtime_verify_monitor.sh

# Follow runtime verification log
tail -f data/overnight/runtime_verify_*.log

# Docker
docker compose up -d ouroboros            # Run agent
docker compose --profile test up test     # Run tests in container
@@ -34,6 +34,12 @@ Main Agent ideation responsibilities:
- DCP-03 implementation start: Main Agent approval required before the end of Phase 2
- DCP-04 deployment approval: Main Agent final approval required after the end of Phase 4

Main/Verifier incident-recurrence prevention rules:
- The Main Agent must not delegate verification unless the `Directive Contract` is satisfied
- The Verifier Agent immediately declares `BLOCKED` and requests clarification when directives are missing or ambiguous
- The Verifier Agent may not report `NOT_OBSERVED` items as PASS
- When runtime verification lacks required session evidence, file it as an `unverified anomaly` issue rather than treating it as "normal"

## Phase Control Gates

### Phase 0: Scenario Intake and Scope Lock
@@ -112,7 +118,10 @@ Exit criteria:
Control checks:
- Verifier attaches test evidence (logs/reports/run commands)
- Verifier attaches the `Coverage Matrix` (`REQ/TASK/TEST` x `PASS/FAIL/NOT_OBSERVED`)
- Confirm the `NOT_OBSERVED` item count is 0 (non-zero fails the Gate)
- Runtime Verifier approves the staging/production monitoring plan
- Confirm both approvals: static Verifier and Runtime Verifier
- Deliverable: acceptance approval record

### Phase 5: Release and Post-Release Control
@@ -150,6 +159,17 @@ TPM ticket operation rules:
- The PR body must reflect the priority and scope designated by the TPM as-is.
- Priority changes are possible only via a TPM proposal plus Main Agent approval.
- PM/TPM/Dev/Reviewer/Verifier/Runtime Verifier leave a PR comment at every major decision point so the decision rationale stays traceable.
- PM/TPM/Dev/Reviewer/Verifier/Runtime Verifier must consult the Gitea troubleshooting sections of `docs/commands.md` and `docs/workflow.md` before any issue/PR/comment operation.
- GitHub CLI (`gh`) is prohibited for repository collaboration; only `tea` (with the documented API fallback when needed) is allowed for Gitea work.
- Once a recurrence-prevention/operational rule change is agreed, the process ticket must be created and merged before feature implementation.
- If an implementation ticket proceeds while the process ticket is not yet applied, the TPM immediately marks it `BLOCKED`.

Ticket maturity stages (Mandatory):
- `Implemented`: code/document changes complete
- `Integrated`: call path/pipeline wiring confirmed
- `Observed`: runtime/execution evidence secured
- `Accepted`: Verifier + Runtime Verifier approval complete
- Stages may only advance sequentially; stage jumps are not allowed.

Branch operation rules:
- The TPM designates a `ticket temp branch -> program feature branch` PR path for each ticket.
@@ -168,6 +188,8 @@ TPM ticket operation rules:
- Execute the actual system run (staging/local production mode)
- Perform the monitoring checklist (core alerts/order paths/exception logs)
- Not considered complete unless results are attached as evidence to ticket/PR comments
- Among required per-session observation points (`NXT`, `US_PRE`, `US_DAY`, `US_AFTER`, etc.), record any unobserved item as `NOT_OBSERVED`
- If any `NOT_OBSERVED` exists: approval prohibited and a Runtime issue must be filed

## Server Reflection Rule
@@ -48,6 +48,8 @@ Updated: 2026-02-26
Pre-merge checklist:
- Issue link (`Closes #N`) present
- PR body contains the `REQ-*`, `TASK-*`, `TEST-*` mapping table
- Main -> Verifier Directive Contract recorded (scope/method/pass/fail/not-observed/evidence format)
- If subject to process-change-first, the process ticket PR is merged first
- No changes to `src/core/risk_manager.py`
- Main Agent confirmation record exists for the applicable major decision checkpoint (DCP-01~04)
- Agent PR comments exist for major decisions (review findings/fix agreements/verification approvals)
@@ -57,6 +59,12 @@ Updated: 2026-02-26
- Document validation scripts pass
- Tests pass
- On development completion, a system run/monitoring evidence comment exists
- A comment confirming `docs/commands.md` and `docs/workflow.md` troubleshooting review exists before issue/PR operations
- No `gh` CLI usage; `tea` usage evidence exists
- Verifier `Coverage Matrix` attached (PASS/FAIL/NOT_OBSERVED)
- `NOT_OBSERVED` item count confirmed to be 0 (non-zero blocks merge)
- Ticket stage records (`Implemented` -> `Integrated` -> `Observed` -> `Accepted`) exist
- Both approvals confirmed: static Verifier and Runtime Verifier

## 5) Audit Trail
@@ -16,6 +16,20 @@
**Never commit directly to `main`.** This policy applies to all changes, no exceptions.

## Agent Gitea Preflight (Mandatory)

Before any Gitea issue/PR/comment work, every agent must first confirm the following.

1. Check the `tea CLI` failure cases/resolution patterns in `docs/commands.md`
2. Check `Gitea CLI Formatting Troubleshooting` in this document
3. Confirm before running any command that `gh` (GitHub CLI) is not used

Enforcement rules:
- Collaboration commands in this repository use `tea` by default.
- GitHub CLI commands such as `gh issue` and `gh pr` are prohibited.
- On a `tea` failure, record the cause and fix in a PR comment before retrying the same command.
- Use the Gitea API (`localhost:3000`) as a fallback only when necessary.

## Branch Strategy (Mandatory)

- Team operation default branch is the **program feature branch**, not `main`.
@@ -137,6 +151,44 @@ task_tool(
Use `run_in_background=True` for independent tasks that don't block subsequent work.

### Main -> Verifier Directive Contract (Mandatory)

When the main agent delegates work to a verifier agent, the directive is void if any of the six items below is missing.

1. Verification scope: `REQ-*`, `TASK-*`, code/log paths
2. Verification method: run commands and observation points (e.g. per-session log keywords)
3. Pass criteria: PASS conditions stated as numbers/exact wording
4. Fail criteria: FAIL conditions stated as numbers/exact wording
5. Not-observed criteria: `NOT_OBSERVED` conditions and the immediate-escalation rule
6. Evidence format: submitted as a `Coverage Matrix` table in a PR comment

`NOT_OBSERVED` handling rules:
- A requirement item that was not observed must not be treated as PASS
- `NOT_OBSERVED` is treated operationally the same as `FAIL`
- If even one `NOT_OBSERVED` exists, approval/merge is prohibited
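The `NOT_OBSERVED` handling rules above amount to a simple gate function. The sketch below is illustrative only; the function name and matrix shape are assumptions for this example, not part of the repository's actual code.

```python
# Minimal sketch of the NOT_OBSERVED gate described above.
# The matrix shape and function name are illustrative assumptions.

def gate_decision(coverage_matrix: list[dict[str, str]]) -> str:
    """Return 'APPROVE' only when every coverage row is PASS.

    NOT_OBSERVED is treated operationally the same as FAIL,
    so a single occurrence blocks approval/merge.
    """
    statuses = [row["status"] for row in coverage_matrix]
    if any(s in ("FAIL", "NOT_OBSERVED") for s in statuses):
        return "BLOCKED"
    return "APPROVE"


matrix = [
    {"item": "REQ-001", "evidence": "run.log", "status": "PASS"},
    {"item": "REQ-002", "evidence": "-", "status": "NOT_OBSERVED"},
]
print(gate_decision(matrix))  # BLOCKED
```

The point of the design is that absence of evidence never defaults to success: only an explicit all-PASS matrix yields approval.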
### Process-Change-First Rule (Mandatory)

Once a recurrence-prevention/operational rule change is decided, it must land on the server (feature branch) before any feature implementation ticket.

- Order: `process ticket merge` -> `implementation ticket start`
- No feature-ticket coding/merging while the process ticket is unapplied
- The same rule holds across session transitions

### Ticket Maturity Stages (Mandatory)

Every ticket must pass through the four stages below in order.

1. `Implemented`: code/document changes complete
2. `Integrated`: call path/pipeline wiring complete
3. `Observed`: runtime/execution evidence secured
4. `Accepted`: static Verifier + Runtime Verifier approval complete

Enforcement rules:
- No stage jumps (e.g. Implemented -> Accepted is prohibited)
- No completion declaration before `Observed`
- No merge before `Accepted`
## Code Review Checklist

**CRITICAL: Every PR review MUST verify plan-implementation consistency.**
@@ -170,3 +222,10 @@ Before approving any PR, the reviewer (human or agent) must check ALL of the fol
- [ ] PR references the Gitea issue number
- [ ] Feature branch follows naming convention (`feature/issue-N-description`)
- [ ] Commit messages are clear and descriptive
- [ ] Troubleshooting sections of `docs/commands.md` and this document were checked before issue/PR work
- [ ] Only `tea` (or the allowed Gitea API fallback) was used, never `gh`
- [ ] The Main -> Verifier directive includes all six Directive Contract items
- [ ] The Verifier result includes a `Coverage Matrix` (PASS/FAIL/NOT_OBSERVED) with `NOT_OBSERVED=0`
- [ ] If subject to process-change-first, the corresponding process PR was merged first
- [ ] Ticket stages were recorded in `Implemented -> Integrated -> Observed -> Accepted` order
- [ ] Approval comments from both the static Verifier and the Runtime Verifier exist
scripts/runtime_verify_monitor.sh (new executable file, 78 lines)
@@ -0,0 +1,78 @@
#!/usr/bin/env bash
# Runtime verification monitor with NOT_OBSERVED detection.

set -euo pipefail

ROOT_DIR="${ROOT_DIR:-/home/agentson/repos/The-Ouroboros}"
LOG_DIR="${LOG_DIR:-$ROOT_DIR/data/overnight}"
INTERVAL_SEC="${INTERVAL_SEC:-60}"
MAX_HOURS="${MAX_HOURS:-24}"

cd "$ROOT_DIR"

OUT_LOG="$LOG_DIR/runtime_verify_$(date +%Y%m%d_%H%M%S).log"
END_TS=$(( $(date +%s) + MAX_HOURS*3600 ))

log() {
  printf '%s %s\n' "$(date -u +%Y-%m-%dT%H:%M:%SZ)" "$1" | tee -a "$OUT_LOG" >/dev/null
}

check_signal() {
  local name="$1"
  local pattern="$2"
  local run_log="$3"

  if rg -q "$pattern" "$run_log"; then
    log "[COVERAGE] ${name}=PASS pattern=${pattern}"
    return 0
  fi
  log "[COVERAGE] ${name}=NOT_OBSERVED pattern=${pattern}"
  return 1
}

log "[INFO] runtime verify monitor started interval=${INTERVAL_SEC}s max_hours=${MAX_HOURS}"

while true; do
  now=$(date +%s)
  if [ "$now" -ge "$END_TS" ]; then
    log "[INFO] monitor completed (time window reached)"
    exit 0
  fi

  latest_run="$(ls -t "$LOG_DIR"/run_*.log 2>/dev/null | head -n1 || true)"
  if [ -z "$latest_run" ]; then
    log "[ANOMALY] no run log found"
    sleep "$INTERVAL_SEC"
    continue
  fi

  # Basic liveness hints.
  app_pid="$(cat "$LOG_DIR/app.pid" 2>/dev/null || true)"
  wd_pid="$(cat "$LOG_DIR/watchdog.pid" 2>/dev/null || true)"
  app_alive=0
  wd_alive=0
  port_alive=0
  [ -n "$app_pid" ] && kill -0 "$app_pid" 2>/dev/null && app_alive=1
  [ -n "$wd_pid" ] && kill -0 "$wd_pid" 2>/dev/null && wd_alive=1
  ss -ltnp 2>/dev/null | rg -q ':8080' && port_alive=1
  log "[HEARTBEAT] run_log=$latest_run app_alive=$app_alive watchdog_alive=$wd_alive port8080=$port_alive"

  # Coverage matrix rows (session paths and policy gate evidence).
  not_observed=0
  check_signal "LIVE_MODE" "Mode: live" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "KR_LOOP" "Processing market: Korea Exchange" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "NXT_PATH" "NXT_PRE|NXT_AFTER|session=NXT_" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "US_PRE_PATH" "US_PRE|session=US_PRE" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "US_DAY_PATH" "US_DAY|session=US_DAY|Processing market: .*NASDAQ|Processing market: .*NYSE|Processing market: .*AMEX" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "US_AFTER_PATH" "US_AFTER|session=US_AFTER" "$latest_run" || not_observed=$((not_observed+1))
  check_signal "ORDER_POLICY_SESSION" "Order policy rejected .*\\[session=" "$latest_run" || not_observed=$((not_observed+1))

  if [ "$not_observed" -gt 0 ]; then
    log "[ANOMALY] coverage_not_observed=$not_observed (treat as FAIL)"
  else
    log "[OK] coverage complete (NOT_OBSERVED=0)"
  fi

  sleep "$INTERVAL_SEC"
done
scripts/validate_governance_assets.py (new file, 61 lines)
@@ -0,0 +1,61 @@
#!/usr/bin/env python3
"""Validate persistent governance assets for agent workflow safety."""

from __future__ import annotations

import sys
from pathlib import Path


def must_contain(path: Path, required: list[str], errors: list[str]) -> None:
    if not path.exists():
        errors.append(f"missing file: {path}")
        return
    text = path.read_text(encoding="utf-8")
    for token in required:
        if token not in text:
            errors.append(f"{path}: missing required token -> {token}")


def main() -> int:
    errors: list[str] = []

    pr_template = Path(".gitea/PULL_REQUEST_TEMPLATE.md")
    issue_template = Path(".gitea/ISSUE_TEMPLATE/runtime_verification.md")

    must_contain(
        pr_template,
        [
            "Closes #N",
            "Main -> Verifier Directive Contract",
            "Coverage Matrix",
            "NOT_OBSERVED",
            "tea",
            "gh",
        ],
        errors,
    )
    must_contain(
        issue_template,
        [
            "[RUNTIME-VERIFY][SCN-XXX]",
            "Requirement Mapping",
            "Close Criteria",
            "NOT_OBSERVED = 0",
        ],
        errors,
    )

    if errors:
        print("[FAIL] governance asset validation failed")
        for err in errors:
            print(f"- {err}")
        return 1

    print("[OK] governance assets validated")
    return 0


if __name__ == "__main__":
    sys.exit(main())
@@ -60,6 +60,7 @@ class Settings(BaseSettings):
    # This value is used as a fallback when the balance API returns 0 in paper mode.
    PAPER_OVERSEAS_CASH: float = Field(default=50000.0, ge=0.0)
    USD_BUFFER_MIN: float = Field(default=1000.0, ge=0.0)
    OVERNIGHT_EXCEPTION_ENABLED: bool = True

    # Trading frequency mode (daily = batch API calls, realtime = per-stock calls)
    TRADE_MODE: str = Field(default="daily", pattern="^(daily|realtime)$")
src/db.py (72 changed lines)
@@ -31,8 +31,12 @@ def init_db(db_path: str) -> sqlite3.Connection:
            quantity INTEGER,
            price REAL,
            pnl REAL DEFAULT 0.0,
            strategy_pnl REAL DEFAULT 0.0,
            fx_pnl REAL DEFAULT 0.0,
            market TEXT DEFAULT 'KR',
            exchange_code TEXT DEFAULT 'KRX',
            session_id TEXT DEFAULT 'UNKNOWN',
            selection_context TEXT,
            decision_id TEXT,
            mode TEXT DEFAULT 'paper'
        )
@@ -53,6 +57,32 @@ def init_db(db_path: str) -> sqlite3.Connection:
        conn.execute("ALTER TABLE trades ADD COLUMN decision_id TEXT")
    if "mode" not in columns:
        conn.execute("ALTER TABLE trades ADD COLUMN mode TEXT DEFAULT 'paper'")
    session_id_added = False
    if "session_id" not in columns:
        conn.execute("ALTER TABLE trades ADD COLUMN session_id TEXT DEFAULT 'UNKNOWN'")
        session_id_added = True
    if "strategy_pnl" not in columns:
        conn.execute("ALTER TABLE trades ADD COLUMN strategy_pnl REAL DEFAULT 0.0")
    if "fx_pnl" not in columns:
        conn.execute("ALTER TABLE trades ADD COLUMN fx_pnl REAL DEFAULT 0.0")
    # Backfill legacy rows where only pnl existed before split accounting columns.
    conn.execute(
        """
        UPDATE trades
        SET strategy_pnl = pnl, fx_pnl = 0.0
        WHERE pnl != 0.0
          AND strategy_pnl = 0.0
          AND fx_pnl = 0.0
        """
    )
    if session_id_added:
        conn.execute(
            """
            UPDATE trades
            SET session_id = 'UNKNOWN'
            WHERE session_id IS NULL OR session_id = ''
            """
        )

    # Context tree tables for multi-layered memory management
    conn.execute(
@@ -171,8 +201,11 @@ def log_trade(
    quantity: int = 0,
    price: float = 0.0,
    pnl: float = 0.0,
    strategy_pnl: float | None = None,
    fx_pnl: float | None = None,
    market: str = "KR",
    exchange_code: str = "KRX",
    session_id: str | None = None,
    selection_context: dict[str, Any] | None = None,
    decision_id: str | None = None,
    mode: str = "paper",
@@ -187,24 +220,37 @@ def log_trade(
        rationale: AI decision rationale
        quantity: Number of shares
        price: Trade price
        pnl: Total profit/loss (backward compatibility)
        strategy_pnl: Strategy PnL component
        fx_pnl: FX PnL component
        market: Market code
        exchange_code: Exchange code
        session_id: Session identifier (if omitted, auto-derived from market)
        selection_context: Scanner selection data (RSI, volume_ratio, signal, score)
        decision_id: Unique decision identifier for audit linking
        mode: Trading mode ('paper' or 'live') for data separation
    """
    # Serialize selection context to JSON
    context_json = json.dumps(selection_context) if selection_context else None
    resolved_session_id = _resolve_session_id(market=market, session_id=session_id)
    if strategy_pnl is None and fx_pnl is None:
        strategy_pnl = pnl
        fx_pnl = 0.0
    elif strategy_pnl is None:
        strategy_pnl = pnl - float(fx_pnl or 0.0) if pnl != 0.0 else 0.0
    elif fx_pnl is None:
        fx_pnl = pnl - float(strategy_pnl) if pnl != 0.0 else 0.0
    if pnl == 0.0 and (strategy_pnl or fx_pnl):
        pnl = float(strategy_pnl) + float(fx_pnl)

    conn.execute(
        """
        INSERT INTO trades (
            timestamp, stock_code, action, confidence, rationale,
            quantity, price, pnl, strategy_pnl, fx_pnl,
            market, exchange_code, session_id, selection_context, decision_id, mode
        )
        VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?)
        """,
        (
            datetime.now(UTC).isoformat(),
@@ -215,8 +261,11 @@ def log_trade(
            quantity,
            price,
            pnl,
            strategy_pnl,
            fx_pnl,
            market,
            exchange_code,
            resolved_session_id,
            context_json,
            decision_id,
            mode,
@@ -225,6 +274,21 @@ def log_trade(
    conn.commit()


def _resolve_session_id(*, market: str, session_id: str | None) -> str:
    if session_id:
        return session_id
    try:
        from src.core.order_policy import classify_session_id
        from src.markets.schedule import MARKETS

        market_info = MARKETS.get(market)
        if market_info is not None:
            return classify_session_id(market_info)
    except Exception:
        pass
    return "UNKNOWN"


def get_latest_buy_trade(
    conn: sqlite3.Connection, stock_code: str, market: str
) -> dict[str, Any] | None:
src/main.py (78 changed lines)
@@ -33,7 +33,11 @@ from src.core.blackout_manager import (
    parse_blackout_windows_kst,
)
from src.core.kill_switch import KillSwitchOrchestrator
from src.core.order_policy import (
    OrderPolicyRejected,
    get_session_info,
    validate_order_policy,
)
from src.core.priority_queue import PriorityTaskQueue
from src.core.risk_manager import CircuitBreakerTripped, FatFingerRejected, RiskManager
from src.db import (
@@ -63,6 +67,7 @@ BLACKOUT_ORDER_MANAGER = BlackoutOrderManager(
    windows=[],
    max_queue_size=500,
)
_SESSION_CLOSE_WINDOWS = {"NXT_AFTER", "US_AFTER"}


def safe_float(value: str | float | None, default: float = 0.0) -> float:
@@ -449,6 +454,21 @@ def _should_block_overseas_buy_for_fx_buffer(
    return remaining < required, remaining, required


def _should_force_exit_for_overnight(
    *,
    market: MarketInfo,
    settings: Settings | None,
) -> bool:
    session_id = get_session_info(market).session_id
    if session_id not in _SESSION_CLOSE_WINDOWS:
        return False
    if KILL_SWITCH.new_orders_blocked:
        return True
    if settings is None:
        return False
    return not settings.OVERNIGHT_EXCEPTION_ENABLED


async def build_overseas_symbol_universe(
    db_conn: Any,
    overseas_broker: OverseasBroker,
@@ -1214,6 +1234,23 @@ async def trading_cycle(
        loss_pct,
        take_profit_threshold,
    )
    if decision.action == "HOLD" and _should_force_exit_for_overnight(
        market=market,
        settings=settings,
    ):
        decision = TradeDecision(
            action="SELL",
            confidence=max(decision.confidence, 85),
            rationale=(
                "Forced exit by overnight policy"
                " (session close window / kill switch priority)"
            ),
        )
        logger.info(
            "Overnight policy override for %s (%s): HOLD -> SELL",
            stock_code,
            market.name,
        )
    logger.info(
        "Decision for %s (%s): %s (confidence=%d)",
        stock_code,
@@ -1274,7 +1311,7 @@ async def trading_cycle(
        trade_price = current_price
        trade_pnl = 0.0
        if decision.action in ("BUY", "SELL"):
            if KILL_SWITCH.new_orders_blocked and decision.action == "BUY":
                logger.critical(
                    "KillSwitch block active: skip %s order for %s (%s)",
                    decision.action,
@@ -2323,6 +2360,25 @@ async def run_daily_session(
            stock_code,
            market.name,
        )
        if decision.action == "HOLD":
            daily_open = get_open_position(db_conn, stock_code, market.code)
            if daily_open and _should_force_exit_for_overnight(
                market=market,
                settings=settings,
            ):
                decision = TradeDecision(
                    action="SELL",
                    confidence=max(decision.confidence, 85),
                    rationale=(
                        "Forced exit by overnight policy"
                        " (session close window / kill switch priority)"
                    ),
                )
                logger.info(
                    "Daily overnight policy override for %s (%s): HOLD -> SELL",
                    stock_code,
                    market.name,
                )

        # Log decision
        context_snapshot = {
@@ -2363,7 +2419,7 @@ async def run_daily_session(
        trade_pnl = 0.0
        order_succeeded = True
        if decision.action in ("BUY", "SELL"):
            if KILL_SWITCH.new_orders_blocked and decision.action == "BUY":
                logger.critical(
                    "KillSwitch block active: skip %s order for %s (%s)",
                    decision.action,
@@ -3352,7 +3408,10 @@ async def run(settings: Settings) -> None:
        _run_context_scheduler(context_scheduler, now=datetime.now(UTC))

        # Get currently open markets
        open_markets = get_open_markets(
            settings.enabled_market_list,
            include_extended_sessions=True,
        )

        if not open_markets:
            # Notify market close for any markets that were open
@@ -3381,7 +3440,8 @@ async def run(settings: Settings) -> None:
            # No markets open — wait until next market opens
            try:
                next_market, next_open_time = get_next_market_open(
                    settings.enabled_market_list,
                    include_extended_sessions=True,
                )
                now = datetime.now(UTC)
                wait_seconds = (next_open_time - now).total_seconds()
@@ -3403,6 +3463,14 @@ async def run(settings: Settings) -> None:
            if shutdown.is_set():
                break

            session_info = get_session_info(market)
            logger.info(
                "Market session active: %s (%s) session=%s",
                market.code,
                market.name,
                session_info.session_id,
            )

            await process_blackout_recovery_orders(
                broker=broker,
                overseas_broker=overseas_broker,
@@ -1,7 +1,7 @@
"""Market schedule management with timezone support."""

from dataclasses import dataclass
from datetime import UTC, datetime, time, timedelta
from zoneinfo import ZoneInfo
@@ -181,7 +181,10 @@ def is_market_open(market: MarketInfo, now: datetime | None = None) -> bool:
def get_open_markets(
    enabled_markets: list[str] | None = None,
    now: datetime | None = None,
    *,
    include_extended_sessions: bool = False,
) -> list[MarketInfo]:
    """
    Get list of currently open markets.
@@ -196,17 +199,31 @@ def get_open_markets(
    if enabled_markets is None:
        enabled_markets = list(MARKETS.keys())

    def is_available(market: MarketInfo) -> bool:
        if not include_extended_sessions:
            return is_market_open(market, now)
        if market.code == "KR" or market.code.startswith("US"):
            # Import lazily to avoid module cycle at import-time.
            from src.core.order_policy import classify_session_id

            session_id = classify_session_id(market, now)
            return session_id not in {"KR_OFF", "US_OFF"}
        return is_market_open(market, now)

    open_markets = [
        MARKETS[code]
        for code in enabled_markets
        if code in MARKETS and is_available(MARKETS[code])
    ]

    return sorted(open_markets, key=lambda m: m.code)


def get_next_market_open(
    enabled_markets: list[str] | None = None,
    now: datetime | None = None,
    *,
    include_extended_sessions: bool = False,
) -> tuple[MarketInfo, datetime]:
    """
    Find the next market that will open and when.
@@ -233,6 +250,21 @@ def get_next_market_open(
    next_open_time: datetime | None = None
    next_market: MarketInfo | None = None

    def first_extended_open_after(market: MarketInfo, start_utc: datetime) -> datetime | None:
        # Search minute-by-minute for KR/US session transition into active window.
        # Bounded to 7 days to match existing behavior.
        from src.core.order_policy import classify_session_id

        ts = start_utc.astimezone(ZoneInfo("UTC")).replace(second=0, microsecond=0)
        prev_active = classify_session_id(market, ts) not in {"KR_OFF", "US_OFF"}
        for _ in range(7 * 24 * 60):
            ts = ts + timedelta(minutes=1)
            active = classify_session_id(market, ts) not in {"KR_OFF", "US_OFF"}
            if active and not prev_active:
                return ts
            prev_active = active
        return None

    for code in enabled_markets:
        if code not in MARKETS:
            continue
@@ -240,6 +272,13 @@ def get_next_market_open(
        market = MARKETS[code]
        market_now = now.astimezone(market.timezone)

        if include_extended_sessions and (market.code == "KR" or market.code.startswith("US")):
            ext_open = first_extended_open_after(market, now.astimezone(UTC))
            if ext_open and (next_open_time is None or ext_open < next_open_time):
                next_open_time = ext_open
                next_market = market
            continue

        # Calculate next open time for this market
        for days_ahead in range(7):  # Check next 7 days
            check_date = market_now.date() + timedelta(days=days_ahead)
tests/test_db.py (136 changed lines)
@@ -155,6 +155,9 @@ def test_mode_column_exists_in_schema() -> None:
    cursor = conn.execute("PRAGMA table_info(trades)")
    columns = {row[1] for row in cursor.fetchall()}
    assert "mode" in columns
    assert "session_id" in columns
    assert "strategy_pnl" in columns
    assert "fx_pnl" in columns


def test_mode_migration_adds_column_to_existing_db() -> None:
@@ -182,6 +185,13 @@ def test_mode_migration_adds_column_to_existing_db() -> None:
            decision_id TEXT
        )"""
        )
        old_conn.execute(
            """
            INSERT INTO trades (
                timestamp, stock_code, action, confidence, rationale, quantity, price, pnl
            ) VALUES ('2026-01-01T00:00:00+00:00', 'AAPL', 'SELL', 90, 'legacy', 1, 100.0, 123.45)
            """
        )
        old_conn.commit()
        old_conn.close()
@@ -190,6 +200,132 @@ def test_mode_migration_adds_column_to_existing_db() -> None:
|
||||
cursor = conn.execute("PRAGMA table_info(trades)")
|
||||
columns = {row[1] for row in cursor.fetchall()}
|
||||
assert "mode" in columns
|
||||
assert "session_id" in columns
|
||||
assert "strategy_pnl" in columns
|
||||
assert "fx_pnl" in columns
|
||||
migrated = conn.execute(
|
||||
"SELECT pnl, strategy_pnl, fx_pnl, session_id FROM trades WHERE stock_code='AAPL' LIMIT 1"
|
||||
).fetchone()
|
||||
assert migrated is not None
|
||||
assert migrated[0] == 123.45
|
||||
assert migrated[1] == 123.45
|
||||
assert migrated[2] == 0.0
|
||||
assert migrated[3] == "UNKNOWN"
|
||||
conn.close()
|
||||
finally:
|
||||
os.unlink(db_path)
|
||||
|
||||
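The migration test above implies a column-add-plus-backfill step: legacy rows get `strategy_pnl = pnl`, `fx_pnl = 0.0`, and `session_id = 'UNKNOWN'`. A sketch of that backfill, assuming the column names from the test (the real `init_db()` migration may be structured differently):

```python
import sqlite3


def migrate_trades(conn: sqlite3.Connection) -> None:
    """Add pnl-split and session columns if missing, backfilling legacy rows."""
    cols = {row[1] for row in conn.execute("PRAGMA table_info(trades)")}
    if "strategy_pnl" not in cols:
        conn.execute("ALTER TABLE trades ADD COLUMN strategy_pnl REAL")
        # Legacy rows carried all pnl as strategy pnl.
        conn.execute("UPDATE trades SET strategy_pnl = pnl")
    if "fx_pnl" not in cols:
        conn.execute("ALTER TABLE trades ADD COLUMN fx_pnl REAL DEFAULT 0.0")
        conn.execute("UPDATE trades SET fx_pnl = 0.0")
    if "session_id" not in cols:
        conn.execute("ALTER TABLE trades ADD COLUMN session_id TEXT DEFAULT 'UNKNOWN'")
        conn.execute("UPDATE trades SET session_id = 'UNKNOWN'")
    conn.commit()
```

Running migration steps guarded by `PRAGMA table_info` keeps them idempotent, so re-running against an already-migrated database is a no-op.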


def test_log_trade_stores_strategy_and_fx_pnl_separately() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="AAPL",
        action="SELL",
        confidence=90,
        rationale="fx split",
        pnl=120.0,
        strategy_pnl=100.0,
        fx_pnl=20.0,
        market="US_NASDAQ",
        exchange_code="NASD",
    )
    row = conn.execute(
        "SELECT pnl, strategy_pnl, fx_pnl FROM trades ORDER BY id DESC LIMIT 1"
    ).fetchone()
    assert row is not None
    assert row[0] == 120.0
    assert row[1] == 100.0
    assert row[2] == 20.0


def test_log_trade_backward_compat_sets_strategy_pnl_from_pnl() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="005930",
        action="SELL",
        confidence=80,
        rationale="legacy",
        pnl=50.0,
        market="KR",
        exchange_code="KRX",
    )
    row = conn.execute(
        "SELECT pnl, strategy_pnl, fx_pnl FROM trades ORDER BY id DESC LIMIT 1"
    ).fetchone()
    assert row is not None
    assert row[0] == 50.0
    assert row[1] == 50.0
    assert row[2] == 0.0


def test_log_trade_partial_fx_input_does_not_infer_negative_strategy_pnl() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="AAPL",
        action="SELL",
        confidence=70,
        rationale="fx only",
        pnl=0.0,
        fx_pnl=10.0,
        market="US_NASDAQ",
        exchange_code="NASD",
    )
    row = conn.execute(
        "SELECT pnl, strategy_pnl, fx_pnl FROM trades ORDER BY id DESC LIMIT 1"
    ).fetchone()
    assert row is not None
    assert row[0] == 10.0
    assert row[1] == 0.0
    assert row[2] == 10.0
def test_log_trade_persists_explicit_session_id() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="AAPL",
        action="BUY",
        confidence=70,
        rationale="session test",
        market="US_NASDAQ",
        exchange_code="NASD",
        session_id="US_PRE",
    )
    row = conn.execute("SELECT session_id FROM trades ORDER BY id DESC LIMIT 1").fetchone()
    assert row is not None
    assert row[0] == "US_PRE"


def test_log_trade_auto_derives_session_id_when_not_provided() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="005930",
        action="BUY",
        confidence=70,
        rationale="auto session",
        market="KR",
        exchange_code="KRX",
    )
    row = conn.execute("SELECT session_id FROM trades ORDER BY id DESC LIMIT 1").fetchone()
    assert row is not None
    assert row[0] != "UNKNOWN"


def test_log_trade_unknown_market_falls_back_to_unknown_session() -> None:
    conn = init_db(":memory:")
    log_trade(
        conn=conn,
        stock_code="X",
        action="BUY",
        confidence=70,
        rationale="unknown market",
        market="MARS",
        exchange_code="MARS",
    )
    row = conn.execute("SELECT session_id FROM trades ORDER BY id DESC LIMIT 1").fetchone()
    assert row is not None
    assert row[0] == "UNKNOWN"

@@ -15,6 +15,7 @@ from src.evolution.scorecard import DailyScorecard
from src.logging.decision_logger import DecisionLogger
from src.main import (
    KILL_SWITCH,
    _should_force_exit_for_overnight,
    _should_block_overseas_buy_for_fx_buffer,
    _trigger_emergency_kill_switch,
    _apply_dashboard_flag,
@@ -5310,6 +5311,88 @@ async def test_order_policy_rejection_skips_order_execution() -> None:
    broker.send_order.assert_not_called()


def test_overnight_policy_prioritizes_killswitch_over_exception() -> None:
    market = MagicMock()
    with patch("src.main.get_session_info", return_value=MagicMock(session_id="US_AFTER")):
        settings = MagicMock()
        settings.OVERNIGHT_EXCEPTION_ENABLED = True
        try:
            KILL_SWITCH.new_orders_blocked = True
            assert _should_force_exit_for_overnight(market=market, settings=settings)
        finally:
            KILL_SWITCH.clear_block()
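The test above asserts a priority rule: an active kill switch forces the overnight exit even when `OVERNIGHT_EXCEPTION_ENABLED` would otherwise allow holding through the `US_AFTER` session. A reduced sketch of that precedence (the real `_should_force_exit_for_overnight()` takes the market, settings, and session info):

```python
def should_force_exit(kill_switch_blocked: bool, overnight_exception_enabled: bool) -> bool:
    """Kill switch outranks the overnight-hold exception."""
    if kill_switch_blocked:
        return True  # safety override: always flatten positions
    return not overnight_exception_enabled
```

Checking the kill switch first means the safety override cannot be masked by any later configuration flag, which is the whole point of the test's name.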


@pytest.mark.asyncio
async def test_kill_switch_block_does_not_block_sell_reduction() -> None:
    """KillSwitch should block BUY entries, but allow SELL risk reduction orders."""
    db_conn = init_db(":memory:")
    decision_logger = DecisionLogger(db_conn)

    broker = MagicMock()
    broker.get_current_price = AsyncMock(return_value=(100.0, 0.5, 0.0))
    broker.get_balance = AsyncMock(
        return_value={
            "output1": [{"pdno": "005930", "ord_psbl_qty": "3"}],
            "output2": [
                {
                    "tot_evlu_amt": "100000",
                    "dnca_tot_amt": "50000",
                    "pchs_amt_smtl_amt": "50000",
                }
            ],
        }
    )
    broker.send_order = AsyncMock(return_value={"msg1": "OK"})

    market = MagicMock()
    market.name = "Korea"
    market.code = "KR"
    market.exchange_code = "KRX"
    market.is_domestic = True

    telegram = MagicMock()
    telegram.notify_trade_execution = AsyncMock()
    telegram.notify_fat_finger = AsyncMock()
    telegram.notify_circuit_breaker = AsyncMock()
    telegram.notify_scenario_matched = AsyncMock()

    settings = MagicMock()
    settings.POSITION_SIZING_ENABLED = False
    settings.CONFIDENCE_THRESHOLD = 80
    settings.OVERNIGHT_EXCEPTION_ENABLED = True
    settings.MODE = "paper"

    try:
        KILL_SWITCH.new_orders_blocked = True
        await trading_cycle(
            broker=broker,
            overseas_broker=MagicMock(),
            scenario_engine=MagicMock(evaluate=MagicMock(return_value=_make_sell_match())),
            playbook=_make_playbook(),
            risk=MagicMock(),
            db_conn=db_conn,
            decision_logger=decision_logger,
            context_store=MagicMock(
                get_latest_timeframe=MagicMock(return_value=None),
                set_context=MagicMock(),
            ),
            criticality_assessor=MagicMock(
                assess_market_conditions=MagicMock(return_value=MagicMock(value="NORMAL")),
                get_timeout=MagicMock(return_value=5.0),
            ),
            telegram=telegram,
            market=market,
            stock_code="005930",
            scan_candidates={},
            settings=settings,
        )
    finally:
        KILL_SWITCH.clear_block()

    broker.send_order.assert_called_once()
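The end-to-end test above drives a full `trading_cycle` to show the kill switch gating new BUY entries while letting SELL risk-reduction orders reach the broker. The underlying gate can be sketched in isolation (names are illustrative, not the real order-policy API):

```python
def order_allowed(action: str, kill_switch_blocked: bool) -> bool:
    """Block new BUY entries while the kill switch is active; SELLs pass through."""
    if kill_switch_blocked and action == "BUY":
        return False
    return action in {"BUY", "SELL"}
```

Gating only the risk-increasing side is the standard design here: an emergency stop that also blocked SELLs would prevent the system from reducing exposure exactly when reduction matters most.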


@pytest.mark.asyncio
async def test_blackout_queues_order_and_skips_submission() -> None:
    """When blackout is active, order submission is replaced by queueing."""

@@ -147,6 +147,24 @@ class TestGetOpenMarkets:
        codes = [m.code for m in open_markets]
        assert codes == sorted(codes)

    def test_get_open_markets_us_pre_extended_session(self) -> None:
        """US premarket should be considered open when extended sessions enabled."""
        # Monday 2026-02-02 08:30 EST = 13:30 UTC (premarket window)
        test_time = datetime(2026, 2, 2, 13, 30, tzinfo=ZoneInfo("UTC"))

        regular = get_open_markets(
            enabled_markets=["US_NASDAQ", "US_NYSE", "US_AMEX"],
            now=test_time,
        )
        assert regular == []

        extended = get_open_markets(
            enabled_markets=["US_NASDAQ", "US_NYSE", "US_AMEX"],
            now=test_time,
            include_extended_sessions=True,
        )
        assert {m.code for m in extended} == {"US_NASDAQ", "US_NYSE", "US_AMEX"}


class TestGetNextMarketOpen:
    """Test get_next_market_open function."""
@@ -201,6 +219,20 @@ class TestGetNextMarketOpen:
        )
        assert market.code == "KR"

    def test_get_next_market_open_prefers_extended_session(self) -> None:
        """Extended lookup should return premarket open time before regular open."""
        # Monday 2026-02-02 07:00 EST = 12:00 UTC
        # By v3 KST session rules, US is OFF only in KST 07:00-10:00 (UTC 22:00-01:00).
        # At 12:00 UTC the market is active, so the next OFF->ON transition is 01:00 UTC next day.
        test_time = datetime(2026, 2, 2, 12, 0, tzinfo=ZoneInfo("UTC"))
        market, next_open = get_next_market_open(
            enabled_markets=["US_NASDAQ"],
            now=test_time,
            include_extended_sessions=True,
        )
        assert market.code == "US_NASDAQ"
        assert next_open == datetime(2026, 2, 3, 1, 0, tzinfo=ZoneInfo("UTC"))
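The comment in the test above compresses a timezone conversion worth spelling out: the US market is OFF only during KST 07:00-10:00, and since KST is UTC+9 with no DST, that window is UTC 22:00-01:00 of the previous UTC day. A small check of that arithmetic:

```python
from datetime import datetime, time
from zoneinfo import ZoneInfo


def in_us_off_window(ts_utc: datetime) -> bool:
    """True when ts falls in the KST 07:00-10:00 OFF window (UTC 22:00-01:00)."""
    kst = ts_utc.astimezone(ZoneInfo("Asia/Seoul"))
    return time(7, 0) <= kst.time() < time(10, 0)
```

This is why the test at 12:00 UTC (21:00 KST) sees an active market and must wait for the next 01:00 UTC reopening, rather than getting 12:00 back.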


class TestExpandMarketCodes:
    """Test shorthand market expansion."""