tenro.linking¶
Decorators for linking agents, LLMs, and tools to Tenro's simulation system.
Use these decorators to mark functions that should be intercepted during testing.
Overview¶
| Decorator | Purpose | Who uses it |
|---|---|---|
| `@link_tool` | Mark a tool/function call | Everyone |
| `@link_agent` | Mark an agent entry point | Everyone |
| `@link_llm` | Mark an LLM API call | Custom agents only (optional) |
Framework users: You don't need @link_llm
If you're using LangChain, CrewAI, LangGraph, or similar frameworks, skip @link_llm. Tenro intercepts LLM calls at the HTTP level, so llm.simulate() works automatically.
See How Tenro works for details.
Quick example¶
```python
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI

from tenro import link_agent, link_tool

@link_tool
def search(query: str) -> list[str]:
    return vector_db.search(query)

@link_agent
def researcher(topic: str) -> str:
    docs = search(topic)
    # LangChain handles LLM calls internally. No @link_llm needed.
    prompt = ChatPromptTemplate.from_template("Summarize: {docs}")
    chain = prompt | ChatOpenAI() | StrOutputParser()
    return chain.invoke({"docs": "\n".join(docs)})
```
```python
from tenro import link_agent, link_llm, link_tool

@link_tool
def search(query: str) -> list[str]:
    """Search for documents."""
    return vector_db.search(query)

@link_llm  # Provider inferred from HTTP call
def summarize(text: str) -> str:
    """Summarize text using OpenAI."""
    response = openai.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": f"Summarize: {text}"}],
    )
    return response.choices[0].message.content

@link_agent  # Uses function name "researcher"
def researcher(topic: str) -> str:
    """Research a topic."""
    docs = search(topic)
    return summarize("\n".join(docs))
```
@link_tool¶
Mark functions that interact with external systems (APIs, databases, file systems).
Usage patterns¶
```python
@link_tool                                           # Uses function name: "get_weather"
@link_tool("weather")                                # Custom name: "weather"
@link_tool("multi", entry_points="run")              # Explicit entry method for classes
@link_tool("multi", entry_points=["run", "stream"])  # Multiple entry methods
```
If you don't provide a name, Tenro uses the function's own name by default.
Custom names are optional. They're useful for code readability but don't affect how Tenro locates or patches your function.
Function-level decoration¶
```python
@link_tool
def get_weather(city: str) -> dict:
    return weather_api.get(city)

@link_tool("db_query")
def query_db(sql: str) -> list:
    return db.execute(sql)
```
Class-level decoration¶
When you decorate a class, Tenro automatically wraps entry methods:
`run`, `invoke`, `_run`, `_arun`, `arun`, `ainvoke`, `__call__`
```python
@link_tool("search")
class SearchTool:
    """A tool that searches documents."""

    def run(self, query: str) -> list[str]:  # Automatically wrapped
        return self.db.search(query)

    def _helper(self):  # NOT wrapped (not an entry method)
        ...
```
Use entry_points to specify which methods should be wrapped when decorating a class (instead of relying on the default list):
```python
@link_tool("multi_search", entry_points=["search", "fetch"])
class MultiSearchTool:
    def search(self, query: str) -> list[str]: ...  # Wrapped
    def fetch(self, url: str) -> str: ...           # Wrapped
    def _internal(self): ...                        # NOT wrapped
```
This is useful when building tools as classes (common in LangChain and CrewAI):
```python
from langchain.tools import BaseTool

@link_tool("calculator")
class CalculatorTool(BaseTool):
    name: str = "calculator"
    description: str = "Adds two numbers"

    def _run(self, a: int, b: int) -> str:  # LangChain calls _run
        return str(a + b)
```
Simulation and verification¶
Once you've decorated your function with @link_tool, you can simulate and verify it in tests:
```python
from tenro.simulate import tool

# Pass the function object (recommended, refactor-safe)
tool.simulate(get_weather, result={"temp": 72})
tool.verify_many(get_weather, count=1)

# Or use the full module path as a string
tool.simulate("myapp.tools.get_weather", result={"temp": 72})
```
Using the function object is recommended because it stays correct when you rename or move the function.
For inspecting recorded calls, see Tracing.
@link_agent¶
Mark top-level agent functions or classes.
Usage patterns¶
```python
@link_agent                                 # Uses function name: "customer_support"
@link_agent("CustomerSupport")              # Custom name: "CustomerSupport"
@link_agent("Agent", entry_points=["run"])  # Explicit entry method for classes
```
If you don't provide a name, Tenro uses the function's own name by default.
The entry_points parameter works the same as for @link_tool - use it to specify custom entry methods for classes.
Examples¶
```python
@link_agent
def customer_support(query: str) -> str:
    """Handle customer support queries."""
    ...

@link_agent("ResearchAssistant")
class ResearchAssistant:
    def run(self, topic: str) -> str:
        ...
```
Simulation and verification¶
Once you've decorated your function with @link_agent, you can simulate and verify it in tests:
```python
from tenro.simulate import agent

# Pass the function or class object (recommended, refactor-safe)
agent.simulate(customer_support, result="Issue resolved")
agent.verify_many(customer_support, count=1)

# Or use the full module path as a string
agent.simulate("myapp.agents.customer_support", result="Issue resolved")
```
Using the function object is recommended because it stays correct when you rename or move the function.
For inspecting recorded runs, see Tracing.
@link_llm¶
Mark functions that call LLM providers directly.
For custom agent builders only
@link_llm is designed for developers building agents with raw LLM SDKs (OpenAI, Anthropic, etc.).
Framework users don't need it: frameworks like LangChain, CrewAI, and LangGraph make LLM calls inside their internal implementation, where you can't add a decorator. Tenro handles these calls via HTTP interception automatically.
Usage patterns¶
```python
@link_llm                                  # Provider inferred from HTTP call
@link_llm()                                # Provider inferred from HTTP call
@link_llm(Provider.OPENAI)                 # Explicit provider
@link_llm(Provider.OPENAI, model="gpt-4")  # With model info
```
When to use explicit provider¶
Specifying the provider acts as a safeguard:
```python
from tenro import link_llm, Provider

@link_llm(Provider.OPENAI)  # Expect OpenAI calls only
def call_openai(prompt: str) -> str:
    # If this accidentally calls Anthropic, you'll get an error
    return openai.chat.completions.create(...)
```
If the function makes an HTTP call to a different provider than specified, Tenro will raise an error during validation.
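Under the hood, this check can be pictured as simple host matching between the declared provider and the intercepted request URL. The sketch below is illustrative only (`PROVIDER_HOSTS` and `validate_provider` are hypothetical names, not Tenro's internals):

```python
from urllib.parse import urlparse

# Hypothetical mapping from declared provider to expected API host.
PROVIDER_HOSTS = {
    "OPENAI": "api.openai.com",
    "ANTHROPIC": "api.anthropic.com",
    "GEMINI": "generativelanguage.googleapis.com",
}

def validate_provider(declared: str, request_url: str) -> None:
    """Raise if the intercepted request targets a different provider's host."""
    host = urlparse(request_url).hostname
    expected = PROVIDER_HOSTS[declared]
    if host != expected:
        raise ValueError(
            f"@link_llm declared {declared} (host {expected}), "
            f"but the function called {host}"
        )

# Matches the declared provider: no error.
validate_provider("OPENAI", "https://api.openai.com/v1/chat/completions")
```

Passing an Anthropic URL to a function declared as `Provider.OPENAI` would raise `ValueError` in this sketch, analogous to the validation error described above.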
What you gain with @link_llm¶
| With `@link_llm` | Without `@link_llm` |
|---|---|
| `caller_signature`: `call_openai(prompt: str) -> str` | None |
| `caller_location`: `/path/to/file.py:42` | None |
| `llm_scope_id`: links `LLMCall` to the decorated function | None |
| Targeted simulation with `target=my_function` | Provider-level simulation only |
| Better error messages showing which function failed | Generic "LLM call failed" |
Bottom line: HTTP interception works without @link_llm. Use @link_llm when you need targeted simulation or want traceability.
Examples¶
```python
from tenro import link_llm, Provider

@link_llm  # Provider auto-detected
def call_gpt(prompt: str) -> str:
    response = openai.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

@link_llm(Provider.ANTHROPIC)  # Explicit provider for safeguard
def call_claude(prompt: str) -> str:
    response = anthropic.messages.create(
        model="claude-3-opus-20240229",
        messages=[{"role": "user", "content": prompt}],
    )
    return response.content[0].text
```
Supported providers¶
- `Provider.OPENAI`: OpenAI API
- `Provider.ANTHROPIC`: Anthropic API
- `Provider.GEMINI`: Google Gemini API
When to use @link_llm¶
| Scenario | Use `@link_llm`? | Reason |
|---|---|---|
| LangChain, CrewAI, LangGraph | No | Can't decorate framework internals |
| Pydantic AI, AutoGen, LlamaIndex | No | Can't decorate framework internals |
| Single raw SDK function | Optional | Nice for tracing |
| Multiple raw SDK functions | Recommended | Distinguish which function made each call |
| Want provider safeguard | Yes | Catches accidental wrong-provider calls |
Linking framework agents¶
For agents built with frameworks like LangChain or CrewAI, link the entry point only:
```python
# LangChain
@link_agent  # Uses function name
def run_langchain_agent(input: str) -> str:
    agent = create_langchain_agent()
    return agent.invoke(input)

# CrewAI
@link_agent("CrewAIAgent")  # Explicit name
def run_crew(task: str) -> str:
    crew = Crew(agents=[...], tasks=[...])
    return crew.kickoff()
```
Reference¶
Linking utilities for agents, LLMs, and tools.
Provides decorators for registering functions, classes, and framework objects with the Tenro system for testing and observability.
link_agent¶
```python
link_agent(name: str | Callable[..., Any] | None = None, *, entry_points: str | list[str] | None = None) -> Callable[..., Any]
```
Decorator to register agent functions, classes, or objects with Tenro.
When a Construct is active, the decorator records an agent span. Otherwise, the function/method executes normally. Set TENRO_LINKING_ENABLED=false to disable decorator wrapping and return the original target unchanged.
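The `TENRO_LINKING_ENABLED=false` bypass can be pictured as an environment check at decoration time: if linking is disabled, the decorator hands back the original target untouched. A minimal, hypothetical sketch (`link_agent_sketch` and `linking_enabled` are illustrative names, not Tenro's implementation):

```python
import os
from typing import Any, Callable

def linking_enabled() -> bool:
    # Mirrors the documented behaviour: only an explicit "false" disables wrapping.
    return os.environ.get("TENRO_LINKING_ENABLED", "true").lower() != "false"

def link_agent_sketch(fn: Callable[..., Any]) -> Callable[..., Any]:
    if not linking_enabled():
        return fn  # return the original target unchanged

    def wrapper(*args: Any, **kwargs: Any) -> Any:
        # span recording would happen here when a Construct is active
        return fn(*args, **kwargs)
    return wrapper

os.environ["TENRO_LINKING_ENABLED"] = "false"

@link_agent_sketch
def my_agent(task: str) -> str:
    return f"done: {task}"
```

Because the flag is checked before wrapping, disabling it leaves `my_agent` as the original, unwrapped function.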
Supports:

- Sync and async functions
- Classes with auto-detected entry methods (or explicit via `entry_points=`)
- Framework objects (patches `invoke`/`run` methods)
For classes, the decorator wraps ALL matching entry methods with a re-entrancy guard, so method delegation (e.g., invoke → stream) creates only one span.
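The re-entrancy guard described above can be sketched as a thread-local flag: the outermost wrapped call records a span, and any nested wrapped call (e.g. `invoke` delegating to `run`) passes straight through. Illustrative only; `guarded` and `spans` are hypothetical names, not Tenro's internals:

```python
import threading
from functools import wraps
from typing import Any, Callable

spans: list[str] = []  # stand-in for real span recording
_guard = threading.local()

def guarded(agent_name: str, method: Callable[..., Any]) -> Callable[..., Any]:
    @wraps(method)
    def wrapper(self: Any, *args: Any, **kwargs: Any) -> Any:
        if getattr(_guard, "active", False):
            return method(self, *args, **kwargs)  # nested call: no extra span
        _guard.active = True
        try:
            spans.append(agent_name)  # record one span for the outermost call
            return method(self, *args, **kwargs)
        finally:
            _guard.active = False
    return wrapper

class Agent:
    def invoke(self, task: str) -> str:
        return self.run(task)  # delegation to another wrapped entry method

    def run(self, task: str) -> str:
        return f"ran {task}"

# Wrap both entry methods, as the decorator would for a class target.
Agent.invoke = guarded("Agent", Agent.invoke)
Agent.run = guarded("Agent", Agent.run)
```

Calling `Agent().invoke("t")` goes through both wrapped methods but appends only one entry to `spans`.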
Can be used with or without parentheses:

- `@link_agent`
- `@link_agent()`
- `@link_agent("CustomName")`
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str \| Callable[..., Any] \| None` | Agent name for the span. If None, uses function/class name. Can also be the target itself when used without parentheses. | `None` |
| `entry_points` | `str \| list[str] \| None` | For classes only. Explicit method name(s) to wrap. Can be a single string or list of strings. If None, auto-detects common entry methods (`run`, `invoke`, `execute`, `call`, `stream`, etc.). | `None` |
Returns:
| Type | Description |
|---|---|
| `Callable[..., Any]` | Decorated target that registers with active Construct. |
Raises:
| Type | Description |
|---|---|
| `ValueError` | If decorating a class and no entry method is found. |
Examples:
```python
>>> @link_agent
... def simple_agent(task: str) -> str:
...     return "done"
>>>
>>> @link_agent("PlannerBot")
... def plan_trip(destination: str) -> str:
...     return agent.run(destination)
>>>
>>> @link_agent("WriterAgent")
... class WriterAgent:
...     async def execute(self, prompt: str) -> str:
...         return "result"
>>>
>>> @link_agent("MultiEntry", entry_points=["run", "stream"])
... class MultiEntryAgent:
...     def run(self, task: str) -> str: ...
...     def stream(self, task: str) -> Iterator[str]: ...
```
Source code in tenro/linking/decorators.py
link_llm¶
```python
link_llm(provider: str | Callable[..., Any] | None = None, model: str | None = None) -> Callable[..., Any]
```
Decorator to mark functions as LLM call boundaries.
Creates an LLMScope (transparent annotation span) when a Construct is active. HTTP interception will create LLMCall spans inside this scope. The scope captures caller info for debugging but is transparent for parent attribution.
The provider can be specified explicitly or inferred automatically from HTTP interception (URL pattern matching) or simulation configuration.
Set TENRO_LINKING_ENABLED=false to disable decorator wrapping and return the original function unchanged.
Can be used with or without parentheses:

- `@link_llm`
- `@link_llm()`
- `@link_llm(Provider.OPENAI)`
- `@link_llm(provider=Provider.OPENAI, model="gpt-4")`
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `provider` | `str \| Callable[..., Any] \| None` | LLM provider (e.g., `Provider.OPENAI`, `Provider.ANTHROPIC`), or the decorated function when used without parentheses. If None, provider is inferred from HTTP interception. | `None` |
| `model` | `str \| None` | Model identifier (e.g., `"gpt-4"`, `"claude-3"`). | `None` |
Returns:
| Type | Description |
|---|---|
| `Callable[..., Any]` | Decorated function that creates LLMScope when Construct is active. |
Examples:
```python
>>> @link_llm  # Provider inferred from HTTP call
... def call_llm(prompt: str) -> str:
...     return client.chat.completions.create(...)
>>>
>>> @link_llm(Provider.OPENAI, model="gpt-4")  # Explicit provider
... def call_openai(prompt: str) -> str:
...     return openai_client.chat.completions.create(...)
```
Source code in tenro/linking/decorators.py
link_tool¶
```python
link_tool(name: str | Callable[..., Any] | None = None, *, entry_points: str | list[str] | None = None) -> Callable[..., Any]
```
Decorator to register tool functions, classes, or objects with Tenro.
When a Construct is active, the decorator records a tool span. Otherwise, the function executes normally. Set TENRO_LINKING_ENABLED=false to disable decorator wrapping and return the original target unchanged.
At decoration time (import), registers tool to GlobalDeclaredRegistry for attack surface tracking and coverage calculation.
Supports:

- Sync and async functions
- Classes with auto-detected entry methods (or explicit via `entry_points=`)
- Framework objects (patches `invoke`/`run` methods)
Can be used with or without parentheses:

- `@link_tool`
- `@link_tool()`
- `@link_tool("CustomName")`
Parameters:
| Name | Type | Description | Default |
|---|---|---|---|
| `name` | `str \| Callable[..., Any] \| None` | Tool name for the span. If None, uses function/class name. Can also be the target itself when used without parentheses. | `None` |
| `entry_points` | `str \| list[str] \| None` | For classes only. Explicit method name(s) to wrap. Can be a single string or list of strings. If None, auto-detects common entry methods (`run`, `invoke`, `execute`, `call`, etc.). | `None` |
Returns:
| Type | Description |
|---|---|
| `Callable[..., Any]` | Decorated target that registers with active Construct. |
Examples:
```python
>>> @link_tool
... def simple_tool(query: str) -> str:
...     return "result"
>>>
>>> @link_tool("search")
... def search(query: str) -> list[str]:
...     return ["result1", "result2"]
>>>
>>> @link_tool("calculator")
... class Calculator:
...     def invoke(self, a: int, b: int) -> int:
...         return a + b
>>>
>>> @link_tool("multi_tool", entry_points=["search", "fetch"])
... class MultiTool:
...     def search(self, q: str) -> list[str]: ...
...     def fetch(self, url: str) -> str: ...
```
Source code in tenro/linking/decorators.py