Surprise Launch Week – Day 3 – LangChain + LlamaIndex support is here!

Today we’re releasing two small Python packages that let you trigger Skyvern browser tasks directly from LangChain or LlamaIndex code.

  • skyvern-langchain – LangChain Tool + Agent helpers
  • skyvern-llamaindex – LlamaIndex Tool + Runnable helpers

Both follow the same pattern: create a task, then either block until it finishes locally or dispatch it to Skyvern Cloud and fetch the result later. Nothing else to set up.


Installation

pip install skyvern-langchain            # LangChain
pip install skyvern-llamaindex           # LlamaIndex

Minimal LangChain example

import asyncio
from skyvern_langchain.agent import RunTask  # local, blocking

async def main():
    result = await RunTask().ainvoke(
        "Navigate to Hacker News and list the top 3 posts."
    )
    print(result)

asyncio.run(main())

Running against the cloud (returns immediately):

from skyvern_langchain.client import DispatchTask

# DispatchTask talks to Skyvern Cloud, so it needs an API key.
# ainvoke is a coroutine: call it from inside an async function
# (e.g. the main() above), not at module level.
task_id = await DispatchTask(api_key="sk-...").ainvoke(
    "Navigate to Hacker News and list the top 3 posts."
)
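Because DispatchTask returns only a task id, you will usually poll until the task reaches a terminal state before reading results. Here is a minimal polling sketch; the `fake_fetch` stub stands in for a real status call (e.g. the GetTask tool), and the terminal status names are assumptions, so check the docs for the exact values:

```python
import asyncio
import time

async def wait_for_task(fetch_status, poll_interval=2.0, timeout=60.0):
    """Poll fetch_status() until the task reaches a terminal state or times out."""
    deadline = time.monotonic() + timeout
    while True:
        task = await fetch_status()
        # Terminal status names here are assumptions, not the confirmed API values.
        if task["status"] in ("completed", "failed", "terminated"):
            return task
        if time.monotonic() >= deadline:
            raise TimeoutError("task did not finish in time")
        await asyncio.sleep(poll_interval)

# Demo with a stub that reports "running" twice, then "completed".
calls = {"n": 0}

async def fake_fetch():
    calls["n"] += 1
    return {"status": "completed" if calls["n"] >= 3 else "running"}

result = asyncio.run(wait_for_task(fake_fetch, poll_interval=0.01))
print(result["status"])  # completed
```

In real code you would replace `fake_fetch` with a closure over the task id returned by DispatchTask.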

If you need agent-style reasoning, initialise an agent with the supplied tools:

from langchain.agents import initialize_agent, AgentType
from langchain_openai import ChatOpenAI
from skyvern_langchain.agent import DispatchTask, GetTask

agent = initialize_agent(
    tools=[DispatchTask(), GetTask()],
    llm=ChatOpenAI(model="gpt-4o"),
    agent=AgentType.STRUCTURED_CHAT_ZERO_SHOT_REACT_DESCRIPTION,
)

Minimal LlamaIndex example

from llama_index.agent.openai import OpenAIAgent
from llama_index.llms.openai import OpenAI
from skyvern_llamaindex.agent import SkyvernTool

skyvern_tool = SkyvernTool()

agent = OpenAIAgent.from_tools(
    tools=[skyvern_tool.run_task()],
    llm=OpenAI(model="gpt-4o"),
    verbose=True,
)

response = agent.chat("Navigate to the Hacker News homepage and get the top 3 posts.")
print(response)

The packages are early but functional; feedback or bug reports are very welcome.

  • LangChain adapter repo → skyvern-langchain on GitHub
  • LlamaIndex adapter repo → skyvern-llamaindex on GitHub
  • Docs have a few more examples and edge-case notes.

Thanks for taking a look—let us know what you think.