Stable
neurovn v0.1.1

Decorator Integration

Install `neurovn` from PyPI and instrument Python sync/async functions with `trace.agent` and `trace.tool` decorators to emit runtime traces without changing business logic.

Purpose

Use decorator integration when workflows are already implemented in Python and you want runtime-observed traces with minimal code changes.

Published on PyPI as `neurovn` for external users.
Supports both async and sync functions.
Captures nested call relationships as parent/child graph edges.
Auto-creates a session if you do not open one explicitly.
Captures runtime token usage when the returned object exposes a `usage` payload.
Explicit sessions print a terminal summary and editor link after successful persistence.
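The runtime token-usage capture mentioned above can be sketched as a toy decorator that inspects the return value for a `usage` attribute. This is an illustration only, not the SDK's internals; `FakeUsage` and `FakeResponse` are hypothetical stand-ins for an LLM client response.

```python
from dataclasses import dataclass
from functools import wraps

# Hypothetical stand-ins for an LLM client response exposing a `usage` payload.
@dataclass
class FakeUsage:
    prompt_tokens: int
    completion_tokens: int

@dataclass
class FakeResponse:
    text: str
    usage: FakeUsage

captured = []

def capture_usage(fn):
    """Toy decorator: records token usage when the return value has a `usage` attribute."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        result = fn(*args, **kwargs)
        usage = getattr(result, "usage", None)
        if usage is not None:
            captured.append({
                "prompt_tokens": usage.prompt_tokens,
                "completion_tokens": usage.completion_tokens,
            })
        return result
    return wrapper

@capture_usage
def call_model(prompt: str) -> FakeResponse:
    return FakeResponse(text="summary", usage=FakeUsage(12, 34))

call_model("hello")
print(captured)  # [{'prompt_tokens': 12, 'completion_tokens': 34}]
```

Returning an object without a `usage` attribute simply skips the capture, which is why the feature is described as conditional on the return shape.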

Developer Flow

1. Install from PyPI

Create a virtualenv and run `pip install --upgrade neurovn`.

2. Point at your backend

Set `NEUROVN_API_URL=https://agentic-flow.onrender.com` so emitted sessions go to the hosted backend.

3. Annotate functions

Add `@trace.agent` to model steps and `@trace.tool` to external call handlers.

4. Optionally scope the session

Wrap execution in `with trace.session(...)` to group multiple top-level calls.

5. Execute the app

Run your existing app/test flow; decorators capture context, return shape, and duration.

6. Review the persisted canvas

On explicit session completion, Neurovn posts the trace, creates a remote canvas, and prints a summary with `canvas_id`, `trace_session_id`, and editor URL. Implicit auto-sessions still persist quietly.

Install & Run

Install

Setup
```bash
python -m venv .venv
source .venv/bin/activate
pip install --upgrade neurovn
```

The published install path is `pip install --upgrade neurovn`. Editable installs remain available for contributors inside this repository, but they are no longer the primary onboarding flow.

Run

Execute
```bash
NEUROVN_API_URL=https://agentic-flow.onrender.com python your_workflow.py
```

If your app already manages environment variables, set `NEUROVN_API_URL` once in that environment and run your script normally.
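A minimal sketch of that resolution pattern: an explicit environment variable wins, otherwise the client falls back to a default. The `DEFAULT_API_URL` fallback here is an assumption for illustration, not the SDK's documented default.

```python
import os

# Assumed local fallback; the real SDK's default may differ.
DEFAULT_API_URL = "http://localhost:8000"

def resolve_api_url() -> str:
    """Explicit NEUROVN_API_URL wins; otherwise use a local default."""
    return os.environ.get("NEUROVN_API_URL", DEFAULT_API_URL)

os.environ["NEUROVN_API_URL"] = "https://agentic-flow.onrender.com"
print(resolve_api_url())  # https://agentic-flow.onrender.com
```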

Architecture Flow

Capture

Decorators capture arguments, return payload summaries, and duration per wrapped function.
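The capture step can be illustrated with a toy decorator that records arguments, a return-shape summary, and wall-clock duration. This is a sketch of the pattern, not Neurovn's actual implementation.

```python
import time
from functools import wraps

events = []  # in-memory stand-in for the trace buffer

def capture(fn):
    """Toy trace capture: arguments, return-type summary, and duration per call."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        result = fn(*args, **kwargs)
        events.append({
            "name": fn.__name__,
            "args": [repr(a) for a in args],
            "return_type": type(result).__name__,
            "duration_s": time.perf_counter() - start,
        })
        return result
    return wrapper

@capture
def research(query: str) -> str:
    return "summary"

research("latest AI infra updates")
print(events[0]["name"], events[0]["return_type"])
```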

Graph build

Nested calls are represented as parent/child nodes and connected edges in a session graph.
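One common way to recover parent/child relationships at runtime is a context variable holding the current caller; each wrapped call records an edge from its parent before becoming the parent itself. The sketch below is illustrative, not Neurovn's internals.

```python
import contextvars
from functools import wraps

_current_parent = contextvars.ContextVar("current_parent", default=None)
edges = []  # (parent, child) pairs forming the session graph

def traced(fn):
    """Toy nesting capture: each wrapped call records an edge from its caller."""
    @wraps(fn)
    def wrapper(*args, **kwargs):
        parent = _current_parent.get()
        if parent is not None:
            edges.append((parent, fn.__name__))
        token = _current_parent.set(fn.__name__)
        try:
            return fn(*args, **kwargs)
        finally:
            _current_parent.reset(token)
    return wrapper

@traced
def search(q):
    return "results"

@traced
def plan(q):
    return search(q)  # nested call produces a ("plan", "search") edge

plan("query")
print(edges)  # [('plan', 'search')]
```

Using `contextvars` rather than a plain global keeps the parent stack correct across async tasks, which matters since the decorators support async functions.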

Emit

At completion, the session emits estimation and trace-session payloads to the backend endpoints, persists a remote canvas in Neurovn, and stores the response in `trace.last_result`.
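The emit step can be sketched as follows. The payload shape and the faked backend reply are assumptions for illustration; the real SDK POSTs to `/api/traces/sessions` and stores the actual response.

```python
import json

last_result = None  # mimics trace.last_result

def emit_session(payload: dict) -> dict:
    """Stand-in for the emit step: serialize, 'post', and store the response.
    The real client posts to /api/traces/sessions; the reply here is faked."""
    global last_result
    body = json.dumps(payload)  # what would go over the wire
    response = {"canvas_id": "cv_123", "trace_session_id": "ts_456"}  # fake reply
    last_result = response  # callers can read canvas_id after the run
    return response

emit_session({"name": "My Workflow", "source": "decorator", "nodes": [], "edges": []})
print(last_result["canvas_id"])  # cv_123
```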

Implementation Snippets

Install from PyPI

```bash
python -m venv .venv
source .venv/bin/activate
pip install --upgrade neurovn
```

Point the SDK at your backend

```bash
export NEUROVN_API_URL=https://agentic-flow.onrender.com
```

Decorator usage

```python
from neurovn import trace

@trace.agent(name="Research Agent", model="gpt-4o")
async def research(query: str) -> str:
    return "summary"

@trace.tool(name="Web Search")
async def web_search(query: str) -> str:
    return "results"
```

Decorator with explicit tool metadata

```python
from neurovn import trace

@trace.tool(name="Knowledge Base Search", tool_id="mcp_web_search", tool_category="mcp_server")
def kb_search(query: str) -> str:
    return "..."
```

Explicit session

```python
from neurovn import trace

with trace.session("My Workflow", source="decorator", canvas_name="My Workflow"):
    run_pipeline()
```

Run your app with the backend URL

```bash
NEUROVN_API_URL=https://agentic-flow.onrender.com python your_workflow.py
```

Read the generated canvas ID

```python
from neurovn import trace

with trace.session("My Workflow", source="decorator", canvas_name="My Workflow"):
    run_pipeline()

if trace.last_result:
    print(trace.last_result["canvas_id"])
```

End-to-end decorator flow

```python
from neurovn import trace

@trace.agent(name="Planner", model="gpt-4o", provider="OpenAI")
def plan(query: str) -> dict:
    return {"query": query, "action": "search"}

@trace.tool(name="Web Search", tool_id="mcp_web_search", tool_category="mcp_server")
def search(query: str) -> str:
    return "results"

@trace.agent(name="Synthesizer", model="Claude-3.5-Sonnet", provider="Anthropic")
def synthesize(context: str) -> str:
    return "final answer"

with trace.session("Planner workflow", source="decorator"):
    step = plan("latest AI infra updates")
    docs = search(step["query"])
    output = synthesize(docs)
    print(output)
```

Reference

| Item | Details |
| --- | --- |
| PyPI package | `neurovn` (verified in a fresh virtualenv with `neurovn==0.1.1`) |
| `trace.agent` | Marks an LLM step (`name`, `model`, optional `provider`) and emits `agentNode` metadata. |
| `trace.tool` | Marks external capability calls and emits `toolNode` metadata (`tool_id`, `tool_category` optional). |
| `trace.session` | Context manager for grouping multiple calls into one workflow session payload. |
| `NEUROVN_API_URL` | Backend base URL used by the SDK client when you do not pass a custom base URL directly. |
| `trace.last_result` | Holds the latest successful trace-session response so framework integrations can read `canvas_id` programmatically after a run. |
| Persistence model | Decorator sessions end by posting to `/api/traces/sessions`, which creates or updates a remote canvas in Neurovn. |

Troubleshooting

Set `NEUROVN_API_URL` explicitly during local development so traces do not default to the wrong backend.

Use explicit sessions when you want grouped traces, terminal output, and an editor link. Only implicit auto-sessions remain quiet.

If you are signed in and open the printed editor link, Neurovn now imports that traced canvas into My Canvases automatically.

Keep wrapped functions deterministic where possible to compare estimate-vs-actual trends.

For sensitive inputs, avoid logging raw payload fields inside your function bodies.

If traces appear fragmented, ensure nested calls execute within the same process/context manager.

If provider appears incorrect, set explicit `provider` instead of relying on model-name inference.
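Model-name inference typically amounts to prefix matching, which breaks down for custom or fine-tuned model names. The rules below are hypothetical, which is exactly why passing an explicit `provider` is safer.

```python
def infer_provider(model: str) -> str:
    """Naive model-name inference (illustrative only; not the SDK's actual rules)."""
    name = model.lower()
    if name.startswith("gpt"):
        return "OpenAI"
    if name.startswith("claude"):
        return "Anthropic"
    return "unknown"

print(infer_provider("gpt-4o"))              # OpenAI
print(infer_provider("my-finetuned-model"))  # unknown -> pass provider= explicitly
```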

If trace persistence fails locally, verify the backend is running and its Supabase service-role key is real rather than a placeholder.

Related Integrations

Backend contracts: `/api/estimate`, `/api/traces/sessions`, `/api/canvases`