Argon is a browser-based motion analysis and character animation pipeline. Upload reference footage → Argon extracts skeleton pose and facial expression data → apply that data to drive a character image in your visual style.
The entire pipeline runs on serverless GPUs in the cloud (Modal.com). The interface lives on Vercel. Collaborators open arg0n.dev; there is nothing to install.
Designed for animator/designer workflows: non-technical users interact through a React UI, technical users can hit the API directly. CivitAI LoRAs are downloadable and cached server-side for injection into any generation or transfer request.
- `/api/:path*` → Modal URL: Vercel rewrites proxy all `/api/*` calls directly to Modal. Zero additional infrastructure.
- `ArgonRuntime` cls: warm container, 300s idle timeout.
- `modal.Dict` → cross-request job state: shared KV across containers. Long jobs return a `jobId` immediately; clients poll `/api/jobs/:id`.
- `modal.Volume` → `/models/loras/` persistent LoRA cache: persists across deploys and restarts. Download a CivitAI LoRA once; it is cached for all sessions.

| Layer | Technology | Notes |
|---|---|---|
| Frontend | React 18, ReactFlow, Framer Motion, Three.js | CRA — Vercel |
| API client | argon-client.js | Relative /api/* in prod, localhost:7860 in dev |
| GPU backend | Modal.com (Python) | FastAPI + modal.asgi_app() — L4 default |
| Inference | ComfyUI + custom nodes | DWPose, LivePortraitKJ, BiRefNet-ZHO |
| Face analysis | MediaPipe | 468-point face mesh, always available |
| LoRA source | CivitAI API | Download by versionId, cache to Modal Volume |
| Job state | modal.Dict | Cross-container shared KV |
| Model storage | modal.Volume | Persistent /models/ across deploys |
| Local dev | argon-server.js (Node.js) | Zero-dep mock on port 7860 |
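Async endpoints return a `jobId` that clients poll via `/api/jobs/:jobId`. A minimal polling helper, assuming the queued/running/done/error statuses from the endpoint table; `pollJob` and the injectable `fetchImpl` are illustrative names, not part of `argon-client.js`:

```javascript
// Poll an Argon job until it reaches a terminal state (sketch).
// `fetchImpl` is injectable so the same logic works in the browser and in tests.
async function pollJob(jobId, { fetchImpl = fetch, intervalMs = 1000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const res = await fetchImpl(`/api/jobs/${jobId}`);
    const job = await res.json();
    if (job.status === 'done') return job;            // terminal: success
    if (job.status === 'error') throw new Error(job.error ?? 'job failed'); // terminal: failure
    await new Promise((resolve) => setTimeout(resolve, intervalMs));        // queued / running
  }
  throw new Error(`job ${jobId} did not finish after ${maxAttempts} polls`);
}
```

For long sequence renders, `/api/events` (SSE) avoids polling entirely; this helper is the fallback for environments without EventSource.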
Core data primitive: ExpressionCoefficients — 16 continuous 0–1 values extracted from face analysis, used to drive character animation.
```js
// ExpressionCoefficients
{
  jaw, mouthOpen, mouthCornerUp, mouthCornerDown,
  lipPucker, lipStretch,
  browInner, browOuter, browFurrow,
  eyeWide, eyeClose, eyeSquint,
  cheekRaise, noseFlair, noseWrinkle,
  intensity,          // overall expression energy
  emotionVector: {
    valence,          // -1.0 to 1.0, +/- sentiment
    arousal,          //  0.0 to 1.0, calm to excited
    dominance,        //  0.0 to 1.0, passive to dominant
  },
  emotionClass: {     // softmax
    neutral, happy, sad, angry,
    surprised, fearful, disgusted, contempt
  }
}
```
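A client can collapse the `emotionClass` softmax into a single label for UI display. A minimal sketch; `dominantEmotion` is a hypothetical helper, not part of the Argon API:

```javascript
// Pick the highest-probability label from the emotionClass softmax output.
// Field names follow the ExpressionCoefficients sketch above.
function dominantEmotion(emotionClass) {
  return Object.entries(emotionClass)
    .reduce((best, cur) => (cur[1] > best[1] ? cur : best))[0];
}

const coeffs = {
  emotionClass: {
    neutral: 0.1, happy: 0.6, sad: 0.05, angry: 0.05,
    surprised: 0.1, fearful: 0.04, disgusted: 0.03, contempt: 0.03,
  },
};
// dominantEmotion(coeffs.emotionClass) → 'happy'
```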
Pass a BeatEmotionCurve into sequence transfer; expression intensity peaks on downbeats.

```js
// BeatEmotionCurve
{
  trackDurationMs: 180000,
  beats: [
    { timeMs: 0, strength: 1.0 },
    { timeMs: 500, strength: 0.8 },
    { timeMs: 1000, strength: 1.0 },
    // ...
  ]
}
```
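One way the curve might modulate expression is sketched below, assuming a linear falloff around each beat; the actual server-side curve may differ, and `beatScalar` is an illustrative name:

```javascript
// Proximity-to-beat scalar (assumption: linear falloff over a fixed window
// around each beat; Argon's real shaping function is not specified here).
function beatScalar(timeMs, beats, windowMs = 250) {
  let scalar = 0;
  for (const { timeMs: beatMs, strength } of beats) {
    const dist = Math.abs(timeMs - beatMs);
    if (dist < windowMs) {
      // Full beat strength on the beat, fading to zero at the window edge.
      scalar = Math.max(scalar, strength * (1 - dist / windowMs));
    }
  }
  return scalar;
}

const curve = {
  trackDurationMs: 180000,
  beats: [
    { timeMs: 0, strength: 1.0 },
    { timeMs: 500, strength: 0.8 },
  ],
};
beatScalar(0, curve.beats);    // → 1.0 (exactly on the downbeat)
beatScalar(125, curve.beats);  // → 0.5 (halfway into the window)
```

Each frame's expression coefficients would then be multiplied by this scalar before rendering.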
```js
// any generate/transfer request
{
  loraPaths: [
    "/models/loras/style_12345.safetensors",
    "/models/loras/face_67890.safetensors"
  ]
}
```
Drive a character with a real performer's motion and expression, in your visual style.

1. Upload reference video. Argon extracts body skeleton + expression per frame. Returns a MotionTrack ID.
2. Upload a character still. Paste the MotionTrack ID. Add an optional style prompt + LoRA paths.
3. Argon renders each frame: DWPose conditioning + LivePortrait expression drive.
4. Use the frames in your pipeline, or trigger BiRefNet segmentation for alpha isolation.
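The steps above, expressed as API calls against the endpoint table (a sketch: request/response field names such as `trackId` are assumptions, and `fetchImpl` is injectable for testing):

```javascript
// Motion-transfer workflow sketch: analyze reference video, then drive a
// character still with the extracted track. Field names are assumptions.
async function motionTransfer(videoB64, characterB64, { fetchImpl = fetch } = {}) {
  // 1. Extract a MotionTrack from the reference video (async job).
  const analyze = await (await fetchImpl('/api/analyze/motion', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ video: videoB64 }),
  })).json();

  // 2-3. Drive the character still with the extracted track (async job).
  const transfer = await (await fetchImpl('/api/transfer/sequence', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      image: characterB64,
      trackId: analyze.trackId,
      loraPaths: ['/models/loras/style_12345.safetensors'], // optional style LoRA
    }),
  })).json();

  // 4. Caller polls /api/jobs/:jobId for the rendered frames.
  return transfer.jobId;
}
```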
Character expressiveness breathes with music; emotion peaks on downbeats.

1. From a DAW or BPM tool: array of { timeMs, strength } per beat.
2. Upload reference footage or use a mock MotionTrack.
3. Argon multiplies each frame's expression coefficients by a proximity-to-beat scalar.
4. Generate frames in any visual style.

Download once, cached forever.
1. Grab the versionId from the model URL: civitai.com/models/XXXX?modelVersionId=YYYYY
2. Pass { versionId } to /api/loras/download. Downloads and caches to the Modal Volume (~30 seconds). Returns the file path.
3. Pass the path as loraPaths: [path]. ComfyUI LoraLoader injects it at inference time.
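The download-then-inject flow as code (a sketch; the `path` field on the response is an assumption, and `useLora` is a hypothetical helper):

```javascript
// Download a CivitAI LoRA to the Modal Volume, then build the loraPaths
// fragment for subsequent generate/transfer requests.
async function useLora(versionId, { fetchImpl = fetch } = {}) {
  const dl = await (await fetchImpl('/api/loras/download', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ versionId }),
  })).json();
  // The returned path is stable: the Volume caches the file across deploys.
  return { loraPaths: [dl.path] };
}
```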
Consistent identity across all pose angles.

1. Upload a reference portrait. Returns an identityHash string.
2. Send the character prompt + LoRAs + identityLock: hash.
3. Generate THREE_QUARTER, SIDE, and BACK views with the same identityLock hash.
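Building the per-angle requests can be as simple as the sketch below (`identityLock` follows the steps above; `angle` and `posesWithIdentity` are hypothetical names):

```javascript
// Build pose-generation request bodies that all share one identityLock hash,
// so the character's identity stays consistent across angles.
function posesWithIdentity(identityHash, prompt, angles) {
  return angles.map((angle) => ({
    prompt,
    angle,
    identityLock: identityHash,
  }));
}

const requests = posesWithIdentity(
  'abc123',
  'knight in teal armor',
  ['THREE_QUARTER', 'SIDE', 'BACK'],
);
// All three request bodies carry identityLock: 'abc123'.
```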
Clean alpha masks for compositing pipelines.

1. Upload an image. Select a region: face, hair, body, or full.
2. BiRefNet returns a base64 PNG alpha mask. Works on photo-real and illustrated styles.
3. Use it in After Effects, DaVinci Resolve, or any compositing tool.
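A Node-side sketch for handling the returned mask; `decodeMask` is a hypothetical helper that validates the PNG signature before the bytes hit disk:

```javascript
// Decode the base64 PNG mask returned by /api/analyze/segment and verify
// it really is a PNG before saving it for the compositing tool.
function decodeMask(maskB64) {
  const buf = Buffer.from(maskB64, 'base64');
  const pngMagic = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // PNG signature bytes
  if (!buf.subarray(0, 4).equals(pngMagic)) {
    throw new Error('segment response is not a PNG');
  }
  return buf; // e.g. fs.writeFileSync('mask.png', buf)
}
```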
All endpoints at /api/* — Vercel rewrites proxy to Modal. Auth-free during dev phase.
| Method | Endpoint | Sync/Async | Description |
|---|---|---|---|
| GET | /api/health | sync | Backend status |
| POST | /api/analyze/motion | async | Extract MotionTrack from video/image — returns trackId + jobId |
| POST | /api/analyze/expression | sync | 16 expression coefficients + emotion vector |
| POST | /api/analyze/face | sync | 468 MediaPipe landmarks + ARKit 52 blendshapes |
| POST | /api/analyze/segment | sync | BiRefNet mask for face/hair/body/full |
| POST | /api/transfer/expression | sync | LivePortrait single-frame expression drive |
| POST | /api/transfer/sequence | async | Beat-synced animated sequence with LoRA injection |
| POST | /api/generate/image | async | Text-to-image via ComfyUI (SDXL/Flux) + LoRA |
| POST | /api/generate/video | async | Video generation (Dream Machine backend) |
| POST | /api/generate/pose | async | Pose-conditioned character generation |
| POST | /api/loras/download | sync | Download CivitAI LoRA by versionId to Volume |
| GET | /api/loras | sync | List cached LoRAs |
| GET | /api/jobs/:jobId | sync | Poll job: queued / running / done / error |
| GET | /api/events | SSE | Real-time job updates |
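For `/api/events`, a minimal parser for the SSE wire format (this assumes each event is a single `data:` line carrying JSON, which is an assumption about the payload shape; `parseSse` is an illustrative name):

```javascript
// Parse a raw SSE chunk into job-update objects. Events are separated by a
// blank line; each is assumed to be one `data:` line of JSON.
function parseSse(chunk) {
  return chunk
    .split('\n\n')
    .filter((block) => block.startsWith('data:'))
    .map((block) => JSON.parse(block.slice(5).trim()));
}

parseSse('data: {"jobId":"j1","status":"running"}\n\ndata: {"jobId":"j1","status":"done"}\n\n');
// → [{ jobId: 'j1', status: 'running' }, { jobId: 'j1', status: 'done' }]
```

In the browser, `EventSource` handles this framing automatically; a parser like this is only needed when reading the stream manually.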
GitHub: ChopperD00/deadweight-argon · branch: main
| File | Purpose | Lines |
|---|---|---|
| modal_server.py | Modal GPU backend. FastAPI + ArgonRuntime class + all endpoints. | ~400 |
| comfy_workflows.py | ComfyUI workflow builders: DWPose extraction, LivePortrait drive, SDXL/Flux gen. | ~300 |
| comfy_helpers.py | Shared ComfyUI utilities: node builders, polling, image encode/decode. | ~150 |
| argon-server.js | Zero-dep Node.js mock server for local dev. Mirrors modal_server.py API exactly. | ~700 |
| src/lib/argon-client.js | React API client. Env-aware, full interface for all endpoints + SSE + polling. | ~250 |
| vercel.json | CRA build config + /api/* rewrite to Modal URL. | ~20 |
| .env.example | All required env vars documented. | ~20 |
| DEPLOYMENT.md | Step-by-step: modal setup → deploy → update vercel.json. | ~100 |
| USER-GUIDE.md | Non-technical guide with 6 theoretical workflows + glossary. | ~250 |
| ARGON-ANALYSIS-SCHEMA.md | Full TypeScript interface spec for all data contracts. | ~500 |