
TASKS

Codex working agreement

Use these rules for every implementation task in this repository:

  • Write tests first.
  • Use uv for all Python commands and dependency management.
  • Use ruff for linting and formatting.
  • Prefer small vertical slices that can be implemented and verified independently.
  • Keep Proxmox API and business logic out of Textual widgets where possible.
  • Field-like widgets should use compact 3-row layouts by default; bordered fields must not include an empty inner spacer row between the content and the border.
  • Every screen must have explicit default, loading, success, empty, and error states where applicable.
  • Every interactive screen must be covered by Textual interaction tests using run_test() and Pilot.
  • Important visual states should be covered by snapshot tests.
  • At the end of each task, summarize what changed and list unresolved UX concerns.

Repository commands

Codex should use these commands:

  • Install dependencies: uv sync
  • Run app: uv run python -m pve_vm_setup
  • Run tests: uv run pytest
  • Run lint checks: uv run ruff check .
  • Format code: uv run ruff format .

Shared visual validation

  • Use snapshot tests and visual review for form-heavy screens to confirm that single-line field widgets stay 3 rows high by default.
  • Verify that bordered field-like controls do not render an empty spacer line above or below the value row.
  • Allow taller controls only when a widget is intentionally multi-line or needs additional vertical content, and call that out in the task summary as a UX choice.

Implementation backlog

Task 1: Bootstrap the Textual app and project structure

Create the initial Textual application structure and make the repository runnable and testable.

Requirements:

  • Create the application entrypoint used by uv run python -m pve_vm_setup.
  • Set up a project structure that separates app shell, screens, widgets, models, and services.
  • Add the initial test setup for unit tests, Textual interaction tests, and snapshot tests.
  • Add a central state or domain module for the VM configuration workflow.
  • Add a service interface for Proxmox API access so UI code can be tested with fakes or mocks.
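The service interface from the last bullet can be sketched as a typing.Protocol so UI code depends only on the interface and tests can substitute fakes. This is a minimal illustration; the method names and signatures are assumptions, not taken from SPEC.md:

```python
from typing import Protocol, runtime_checkable


@runtime_checkable
class ProxmoxService(Protocol):
    """Boundary between UI code and the Proxmox API (illustrative names)."""

    def list_realms(self) -> list[str]: ...
    def login(self, username: str, password: str, realm: str) -> None: ...
    def list_nodes(self) -> list[str]: ...
    def next_vmid(self, minimum: int = 100) -> int: ...


class FakeProxmoxService:
    """In-memory fake for tests; returns canned data, performs no I/O."""

    def list_realms(self) -> list[str]:
        return ["pam", "pve"]

    def login(self, username: str, password: str, realm: str) -> None:
        pass  # always succeeds in the fake

    def list_nodes(self) -> list[str]:
        return ["pve1"]

    def next_vmid(self, minimum: int = 100) -> int:
        return minimum
```

Because the Protocol is runtime-checkable, tests can assert that both the fake and the real client satisfy the interface.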

Definition of done:

  • Add or update tests first.
  • uv run pytest passes.
  • uv run ruff check . passes.
  • uv run ruff format . has been run.
  • The app opens a minimal Textual shell screen.
  • Summarize what changed and list unresolved UX concerns.

Task 2: Implement authentication screen and Proxmox login flow

Build the login screen shown in SPEC.md and authenticate before the VM creation workflow starts.

Requirements:

  • Ask the user for username and password.
  • Load available authentication realms from the Proxmox API.
  • Let the user choose the authentication realm.
  • Default the authentication realm to Linux PAM standard authentication.
  • Handle authentication failure and loading states.
  • Prevent the wizard from continuing until authentication succeeds.
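The realm-default rule above ("default to Linux PAM standard authentication") can live in a small pure function that is easy to unit-test before wiring it into the screen. A minimal sketch, assuming realm identifiers as returned by the Proxmox access/domains endpoint use "pam" for Linux PAM:

```python
def default_realm(realms: list[str]) -> str:
    """Pick the default authentication realm.

    Prefers "pam" (Linux PAM standard authentication); otherwise falls
    back to the first realm the server reported, or "" if none loaded.
    """
    if "pam" in realms:
        return "pam"
    return realms[0] if realms else ""
```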

Definition of done:

  • Add or update tests first.
  • Add service tests for realm loading and login handling.
  • Add Textual interaction tests using run_test() and Pilot for the login flow.
  • Add snapshot coverage for default, loading, and authentication-error states.
  • Summarize what changed and list unresolved UX concerns.

Task 3: Implement the general VM configuration screen

Build the general configuration screen from SPEC.md.

Fields:

  • Node
  • VM ID
    • Default: next free VM ID above 100
  • Name
  • Resource Pool
    • Default: empty
    • Empty is allowed
    • Load available resource pools from the server
    • The pool list may be empty
  • Tags
    • List existing tags
    • Allow adding tags one by one
    • Allow removing tags
  • High Availability (HA)
    • Default: enabled
  • Start at boot
    • Default: disabled
  • Start/shutdown order
    • Optional
  • Startup delay
    • Optional
  • Shutdown timeout
    • Optional

Definition of done:

  • Add or update tests first.
  • Cover default values, validation, optional fields, and empty pool lists.
  • Add service tests for loading nodes, next VM ID, pools, and existing tags.
  • Add Textual interaction tests using run_test() and Pilot.
  • Add snapshot coverage for the default general screen and an empty-pool state.
  • Summarize what changed and list unresolved UX concerns.

Task 4: Implement the OS selection screen

Build the OS configuration screen from SPEC.md.

Requirements:

  • Ask whether the user wants to use installation media.
  • Valid choices:
    • ISO
    • Physical disc drive
    • No
  • If the user selects ISO:
    • Load available storages from the Proxmox API
    • Default storage to cephfs
    • When the storage changes, reload the available ISOs
    • The ISO list may be empty
    • Let the user choose the ISO
    • Default ISO to the latest available NixOS minimal ISO matching this pattern:
      • nixos-minimal-<two digit year>-<two digit month>.<some alphanumeric id or hash>.<architecture>-linux.iso
  • Let the user select guest type and version
    • Default type: Linux
    • Default version: 6.x - 2.6 Kernel

Definition of done:

  • Add or update tests first.
  • Add service tests for storage and ISO loading and NixOS ISO default selection.
  • Add Textual interaction tests for switching media type, storage, and ISO selection.
  • Add snapshot coverage for:
    • default OS screen
    • ISO selected with results
    • ISO selected with no results
  • Summarize what changed and list unresolved UX concerns.

Task 5: Implement the system configuration screen

Build the system configuration screen from SPEC.md.

Fields:

  • Graphic card
    • Default: Default
  • Machine
    • Default: q35
  • Firmware BIOS
    • Default: OVMF (UEFI)
  • Add EFI Disk
    • Default: enabled
  • EFI storage
    • Default: ceph-pool
  • Pre-Enroll keys
    • Default: disabled
  • SCSI Controller
    • Default: VirtIO SCSI single
  • Qemu Agent
    • Default: enabled
  • TPM
    • Default: disabled

Definition of done:

  • Add or update tests first.
  • Model system settings separately from widgets.
  • Add Textual interaction tests using run_test() and Pilot.
  • Add snapshot coverage for the default system screen.
  • Summarize what changed and list unresolved UX concerns.

Task 6: Implement the disks configuration screen

Build the disks configuration screen from SPEC.md.

Requirements:

  • Allow the user to configure zero to multiple disks.
  • Allow the user to add, modify, and remove disks one by one.
  • For each disk, support:
    • Bus/Device
      • Default bus: SCSI
      • Default device name: the bus prefix plus an increasing integer starting at 0, such as scsi0, scsi1, ...
    • Storage
      • Default: ceph-pool
    • Disk size in GiB
      • Default: 32
    • Format
      • Default: RAW
      • Not changeable
    • Cache
      • Default: no cache
    • Discard
      • Default: disabled
    • IO Thread
      • Default: enabled
    • SSD emulation
      • Default: enabled
    • Backup
      • Default: enabled
    • Skip replication
      • Default: disabled
    • Async IO
      • Default: io_uring
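The default-device rule above (scsi0, scsi1, ...) should fill gaps left by removed disks rather than blindly incrementing. A minimal sketch of that allocation, assuming device names are plain strings like "scsi0":

```python
def next_scsi_device(existing: list[str]) -> str:
    """Return the lowest free SCSI device name (scsi0, scsi1, ...)."""
    used = {
        int(name[4:])
        for name in existing
        if name.startswith("scsi") and name[4:].isdigit()
    }
    n = 0
    while n in used:
        n += 1
    return f"scsi{n}"
```

Reusing the lowest free slot keeps device names stable when the user removes a middle disk and adds a new one.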

Definition of done:

  • Add or update tests first.
  • Model disk configuration separately from widgets.
  • Add service tests for available storage loading if required by the UI.
  • Add Textual interaction tests for add, edit, and remove disk flows.
  • Add snapshot coverage for the default disk state and a multi-disk state.
  • Summarize what changed and list unresolved UX concerns.

Task 7: Implement the CPU configuration screen

Build the CPU configuration screen from SPEC.md.

Fields:

  • Cores
    • Default: 2
  • Sockets
    • Default: 1
  • CPU Type
    • Default: host

Definition of done:

  • Add or update tests first.
  • Cover default values and validation.
  • Add Textual interaction tests using run_test() and Pilot.
  • Add snapshot coverage for the default CPU screen.
  • Summarize what changed and list unresolved UX concerns.

Task 8: Implement the memory configuration screen

Build the memory configuration screen from SPEC.md.

Fields:

  • Memory size in MiB
    • Default: 2048
  • Min Memory in MiB
    • Default: same as Memory size
  • Ballooning
    • Default: enabled
  • Allow KSM
    • Default: enabled

Definition of done:

  • Add or update tests first.
  • Cover default values, derived defaults, and validation.
  • Add Textual interaction tests using run_test() and Pilot.
  • Add snapshot coverage for the default memory screen.
  • Summarize what changed and list unresolved UX concerns.

Task 9: Implement the network configuration screen

Build the network configuration screen from SPEC.md.

Fields:

  • No Network device
    • Default: disabled
  • Bridge
    • Default: vmbr9
  • VLAN Tag
    • Default: none
  • Model
    • Default: virtio
  • MAC Address
    • Default: auto-generated by the API
  • Firewall
    • Default: enabled
  • Disconnected
    • Default: disabled
  • MTU
    • Default: none
  • Rate Limit in MB/s
    • Default: none
  • Multiqueue
    • Default: none

Definition of done:

  • Add or update tests first.
  • Cover defaults, validation, and no-network behavior.
  • Add Textual interaction tests using run_test() and Pilot.
  • Add snapshot coverage for the default network screen and the no-network state.
  • Summarize what changed and list unresolved UX concerns.

Task 10: Implement the confirmation screen

Build the confirmation screen from SPEC.md.

Requirements:

  • Display a summary of the full VM configuration collected in previous steps.
  • Clearly show all relevant settings before submission.
  • Show validation issues or missing required inputs.
  • Provide a Create VM button.

Definition of done:

  • Add or update tests first.
  • Add Textual interaction tests for reaching the confirmation step.
  • Add snapshot coverage for a fully populated confirmation screen and a validation-error state.
  • Summarize what changed and list unresolved UX concerns.

Task 11: Implement VM creation against the Proxmox API

Submit the VM creation request after confirmation.

Requirements:

  • Translate the collected workflow state into the correct Proxmox API request payload.
  • Use the Proxmox VE API to create the VM.
  • After VM creation succeeds, send the required follow-up configuration request(s) to:
    • add serial0: socket
    • set vga: serial0
  • Treat serial-console configuration as part of the overall success criteria because it is required for using the Proxmox VE xterm.js serial console.
  • Handle API request errors cleanly.
  • Distinguish between:
    • VM creation failure before the VM exists
    • post-creation serial-console configuration failure after the VM already exists
  • Show progress, success, and failure states.
  • Preserve enough information in the UI so the user can understand what failed.

Definition of done:

  • Add or update tests first.
  • Add service tests for payload building, request sequencing, and error handling.
  • Add Textual interaction tests for submission, full success, VM-create failure, and post-create serial-console failure flows.
  • Add snapshot coverage for submission, success, VM-create error, and post-create serial-console error states.
  • Summarize what changed and list unresolved UX concerns.

Task 12: Polish navigation, error handling, and documentation

Improve the overall wizard experience and repository documentation.

Requirements:

  • Ensure step-to-step navigation is clear and keyboard-friendly.
  • Make back/next/confirm actions predictable across screens.
  • Standardize loading, empty, success, and error messaging.
  • Update README.md if the real module name or run command differs from the placeholder.
  • Document any remaining constraints, assumptions, or known gaps.

Definition of done:

  • Add or update tests first where behavior changes.
  • Add or update interaction tests for navigation flows.
  • Add snapshot coverage for any changed visual states.
  • Summarize what changed and list unresolved UX concerns.

API references

  • Proxmox API documentation: https://pve.proxmox.com/wiki/Proxmox_VE_API
  • Proxmox API schema / viewer: https://pve.proxmox.com/pve-docs/api-viewer/index.html

Suggested prompt template for Codex

Use this prompt pattern for each task:

Implement TASKS.md task N. First add or update tests. Use Textual interaction tests with run_test() and Pilot. Add snapshot coverage for relevant default, loading, empty, success, and error states. Use uv for commands and ruff for linting / formatting. Keep Proxmox API logic in a service layer. Run uv run pytest, uv run ruff check ., and uv run ruff format . before finishing. Summarize what changed and list unresolved UX concerns.