Stable
neurovn v0.1.1

Code to Canvas

Understand the actual Neurovn code-to-canvas path: your workflow runs locally, Neurovn derives nodes and edges, the backend persists a remote canvas, and you open that canvas in the editor.

Purpose

Use this guide when you want the plain-English mental model before choosing CLI, decorators, or a framework-specific pattern. It explains what is local, what is remote, and when manual import/export is a separate workflow rather than the main ingestion path.

The primary path is direct backend ingestion, not manual import after the fact.
CLI and decorators both end by posting a trace session that creates or updates a remote canvas.
CLI prints a canvas link, and explicit `trace.session(...)` runs now print a matching summary too.
Implicit auto-sessions created by standalone decorators remain intentionally quiet to avoid noisy logs.
`.neurovn.json` import/export exists as a separate UI workflow for snapshots and handoff.
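The explicit-vs-implicit printing behavior above can be sketched with a toy session object. This is an illustrative sketch only: the class and field names here are hypothetical and are not the Neurovn SDK's real internals.

```python
# Toy model of "explicit sessions print, auto-sessions stay quiet".
# All names here are hypothetical, not the real SDK API.
class TraceSession:
    def __init__(self, name: str, explicit: bool):
        self.name = name
        self.explicit = explicit  # True for trace.session(...), False for auto-sessions

    def finish(self, canvas_id: str):
        """Return a printable summary for explicit sessions, None for quiet auto-sessions."""
        if not self.explicit:
            return None  # implicit auto-sessions produce no terminal output
        return f"Open: https://agentic-flow.onrender.com/editor/{canvas_id}"

explicit = TraceSession("Planner Workflow", explicit=True)
implicit = TraceSession("auto", explicit=False)
print(explicit.finish("abc123"))  # an explicit run prints its editor link
```

The point of the split is log hygiene: a decorated function called outside any session still gets traced, but only a deliberate session (or CLI run) earns a line in your terminal.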

Developer Flow

1. Instrument or prepare input

Either add `@trace.agent` / `@trace.tool` decorators to Python code or prepare a workflow JSON file for the CLI.

2. Run locally

Execute your app or `neurovn trace ...` on your machine. Neurovn observes the workflow structure locally first.

3. Build graph payload

Neurovn turns that run into `nodes` and `edges`, then asks the backend for an estimate.
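A minimal sketch of what a nodes-and-edges payload might look like, assuming an ordered workflow. The field names (`id`, `label`, `source`, `target`, `workflow_name`) are illustrative assumptions, not Neurovn's documented wire format.

```python
# Hypothetical payload shape for illustration; the real Neurovn schema may differ.
def build_graph_payload(workflow_name: str, steps: list) -> dict:
    """Turn an ordered list of (node_id, label) steps into a nodes + edges payload."""
    nodes = [{"id": nid, "label": label} for nid, label in steps]
    # Connect consecutive steps with directed edges.
    edges = [
        {"source": steps[i][0], "target": steps[i + 1][0]}
        for i in range(len(steps) - 1)
    ]
    return {"workflow_name": workflow_name, "nodes": nodes, "edges": edges}

payload = build_graph_payload(
    "Research Workflow",
    [("planner", "Planner"), ("search", "Web Search")],
)
```

A payload in roughly this shape is what gets posted for estimation and then persisted as the remote canvas.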

4. Persist remote canvas

The backend stores the graph in Neurovn's SDK trace tables and creates or updates a canvas record remotely.

5. Open the editor

For CLI runs and explicit `trace.session(...)` runs, copy the printed `Open: .../editor/{canvas_id}` URL. Implicit auto-sessions still persist remotely without terminal output.
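The printed link is just the frontend base joined with the canvas ID. A minimal helper, mirroring the `Open: .../editor/{canvas_id}` format shown above (the base host here is taken from the examples in this guide):

```python
# Rebuild the "Open:" link the CLI prints from a base URL and a canvas ID.
def editor_url(base: str, canvas_id: str) -> str:
    return f"{base.rstrip('/')}/editor/{canvas_id}"

link = editor_url("https://agentic-flow.onrender.com/", "abc123")
print(link)
```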

6. Open the printed editor link while signed in

When a signed-in user opens an SDK-trace editor link, Neurovn now materializes that traced canvas into the normal My Canvases tables and redirects to the regular canvas ID.

7. Use manual import/export only when needed

The frontend `.neurovn.json` export/import flow is useful for snapshots, backups, or handoff between people. It is not the primary code-to-canvas ingestion path.

Install & Run

Install

Setup
```bash
python -m venv .venv
source .venv/bin/activate
pip install --upgrade neurovn
```

Most teams start here, then choose either the CLI guide or the decorator guide based on whether they already have workflow JSON or existing Python runtime code.

Run

Execute
```bash
# CLI path
NEUROVN_API_URL=https://agentic-flow.onrender.com neurovn trace ./workflow.json --workflow-name "Research Workflow" --canvas-name "Research Workflow" --source cli

# Decorator path
NEUROVN_API_URL=https://agentic-flow.onrender.com python your_workflow.py
```

Both paths create remote Neurovn canvases. The difference is how the graph is captured: explicit JSON for CLI, observed runtime structure for decorators.

Architecture Flow

Local execution

Your code or workflow JSON lives on your machine. Neurovn first reads or observes it locally.

Backend ingestion

The SDK/CLI posts estimation and trace-session payloads to the Neurovn backend rather than asking the user to upload a file later.

Remote canvas

The backend persists a canvas record that the editor later hydrates by `canvas_id`.
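A sketch of how an editor client could hydrate a canvas by ID. This guide lists the `/api/canvases` endpoint but does not document a per-ID path, so the exact URL shape here is an assumption:

```python
import urllib.request

# Build a GET request for a canvas by id. The /api/canvases/{canvas_id}
# path is an assumption; only the collection endpoint appears in this guide.
def canvas_request(base: str, canvas_id: str) -> urllib.request.Request:
    return urllib.request.Request(
        f"{base.rstrip('/')}/api/canvases/{canvas_id}",
        headers={"Accept": "application/json"},
    )

req = canvas_request("https://agentic-flow.onrender.com", "abc123")
```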

Implementation Snippets

Mental model

```text
local code or workflow.json
  -> Neurovn derives nodes + edges
  -> POST /api/estimate
  -> POST /api/traces/sessions
  -> backend creates remote canvas
  -> open /editor/{canvas_id}
```

CLI path in one command

```bash
NEUROVN_API_URL=https://agentic-flow.onrender.com neurovn trace ./workflow.json --workflow-name "My Workflow" --canvas-name "My Workflow" --source cli
```

Decorator path in one script

```python
from neurovn import trace

@trace.agent(name="Planner", model="gpt-4o", provider="OpenAI")
def plan(query: str) -> dict:
    return {"query": query}

@trace.tool(name="Web Search", tool_id="mcp_web_search", tool_category="mcp_server")
def search(query: str) -> str:
    return "results"

with trace.session("Planner Workflow", source="decorator", canvas_name="Planner Workflow"):
    step = plan("latest model pricing updates")
    docs = search(step["query"])
    print(docs)
```

Manual `.neurovn.json` is a separate path

```text
Use UI export/import when you want:
- a point-in-time canvas snapshot
- a file to share with another person
- an exact reload of canvas layout and metadata

Do not treat it as the required final step after CLI or decorators.
```

Reference

| Item | Details |
| --- | --- |
| CLI ingest | Local workflow JSON -> `POST /api/estimate` -> `POST /api/traces/sessions` -> remote canvas |
| Decorator ingest | Wrapped runtime execution -> in-memory trace session -> estimate -> `POST /api/traces/sessions` -> remote canvas |
| Session result access | After a successful explicit session, `trace.last_result` contains the latest `canvas_id`, `trace_session_id`, and estimation payload. |
| Canvas storage | Trace-generated canvases and sessions are persisted remotely in `sdk_canvases` and `sdk_trace_sessions`. |
| Manual file flow | UI export/import uses `.neurovn.json` for snapshots and exact reloads of canvas state. |
| Framework reality today | LangGraph and CrewAI guides are primarily decorator patterns or JSON ingestion today, not full automatic parsing of source code. |
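The `trace.last_result` row above can be illustrated with a mock. The key names (`canvas_id`, `trace_session_id`, `estimation`) follow this guide's wording, but the concrete structure shown here is an assumption, not the SDK's documented type:

```python
# Mock of the session result described above; the dict shape is illustrative,
# not the Neurovn SDK's documented type.
last_result = {
    "canvas_id": "abc123",
    "trace_session_id": "sess_456",
    "estimation": {"nodes": 2, "edges": 1},
}

def summarize(result: dict) -> str:
    """One-line summary suitable for logging after an explicit session."""
    return f"canvas={result['canvas_id']} session={result['trace_session_id']}"
```

In real code you would read these fields from `trace.last_result` after the `with trace.session(...)` block exits successfully.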

Troubleshooting

If you expected a browser window to open automatically, that is not the current implementation. CLI and explicit sessions print a URL, but neither one opens the browser for you.

If you use an explicit `trace.session(...)`, the SDK now prints the summary and editor link for that run. Only implicit auto-sessions remain quiet.

If you expected a local export artifact after a CLI or decorator run, that is not the primary path. Those flows persist remotely.

SDK traces are still persisted in separate backend tables first, but opening the editor link while signed in now imports them into the regular My Canvases experience automatically.

If you want portable snapshots or handoff files, use the frontend `.neurovn.json` export/import flow separately.

If you are integrating LangGraph or CrewAI today, treat the framework guides as capture patterns, not as proof of a full source parser.

Related Integrations

Backend contracts: `/api/estimate`, `/api/traces/sessions`, `/api/canvases`