A Research Agent powered by Resonate and OpenAI, running on Google Cloud Functions. The Research Agent is a distributed, recursive agent that breaks a research topic into subtopics, researches each subtopic recursively, and synthesizes the results.
This example demonstrates how complex, distributed agentic applications can be implemented with simple code using Resonate's Distributed Async Await. The research agent is a recursive generator function that breaks a topic into subtopics and invokes itself for each subtopic:
function* research(ctx, topic, depth) {
  const messages = [
    { role: "system", content: "Break topics into subtopics..." },
    { role: "user", content: `Research ${topic}` }
  ];

  while (true) {
    // Ask the LLM about the topic
    const response = yield* ctx.run(prompt, messages, ...);
    messages.push(response);

    // If LLM wants to research subtopics...
    if (response.tool_calls) {
      const handles = [];

      // Spawn parallel research for each subtopic
      for (const tool_call of response.tool_calls) {
        const subtopic = ...;
        const handle = yield* ctx.beginRpc(research, subtopic, depth - 1);
        handles.push([tool_call, handle]);
      }

      // Wait for all subtopic results
      for (const [tool_call, handle] of handles) {
        const result = yield* handle;
        messages.push({ role: "tool", ..., content: result });
      }
    } else {
      // LLM provided final summary
      return response.content;
    }
  }
}

The following video visualizes how this recursive pattern creates a dynamic call graph, spawning parallel research branches that fan out as topics are decomposed, then fan back in as results are synthesized:
call-graph.mov
Key concepts:
- Concurrent Execution: Multiple subtopics are researched concurrently via ctx.beginRpc
- Coordination: Handles are collected first, then awaited together (fork/join, fan-out/fan-in); see the sketch below
- Depth control: Recursion stops when depth reaches 0
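To make the fork/join idiom explicit, here is a distilled sketch that uses the same ctx API as the research function above; the fanOutFanIn name and the subtopics argument are illustrative, not part of the example's code:

// Hypothetical helper illustrating fan-out/fan-in with Resonate handles
function* fanOutFanIn(ctx, subtopics, depth) {
  if (depth === 0) return []; // depth control: stop recursing at 0

  // Fan out: beginRpc starts each child execution without awaiting it,
  // so all subtopics are researched concurrently
  const handles = [];
  for (const subtopic of subtopics) {
    handles.push(yield* ctx.beginRpc(research, subtopic, depth - 1));
  }

  // Fan in: awaiting a handle joins the corresponding child execution
  const results = [];
  for (const handle of handles) {
    results.push(yield* handle);
  }
  return results;
}

Awaiting each handle inside the first loop would serialize the children; collecting the handles first is what makes the research branches run in parallel.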
You can run the Deep Research Agent locally on your machine with Google's Functions Framework, or deploy the agent to Google Cloud Platform.
Install the Resonate Server & CLI with Homebrew or download the latest release from GitHub.
brew install resonatehq/tap/resonate
To run this project you also need an OpenAI API key. Export the key as an environment variable:
export OPENAI_API_KEY="sk-..."
Start the Resonate Server. By default, it listens at http://localhost:8001.
resonate dev
Clone the repository
git clone https://github.com/resonatehq-examples/example-openai-deep-research-agent-gcp-ts
cd example-openai-deep-research-agent-gcp-ts
Install dependencies
npm install
Start the Google Cloud Function. By default, it listens at http://localhost:8080.
npm run dev
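Under the hood, npm run dev serves the function with Google's Functions Framework. As a minimal sketch of what such an entry point looks like (the handler body here is a placeholder; the actual Resonate wiring lives in this repository's source):

import * as functions from "@google-cloud/functions-framework";

// Register an HTTP function named "handler", the same entry point name
// used when deploying to Cloud Functions below. In this repository the
// real implementation hands the request off to Resonate, which resumes
// the research function; the body below is only a placeholder.
functions.http("handler", (req, res) => {
  res.status(200).send("ok");
});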
Start a research task, choosing a unique <promise-id> to identify the invocation:
resonate invoke <promise-id> --func research --arg <topic> --arg <depth> --target <function-url>
Example
resonate invoke research.1 --func research --arg "What are distributed systems" --arg 1 --target http://localhost:8080
Use the resonate tree command to visualize the research execution.
resonate tree research.1
This section guides you through deploying the Deep Research Agent to Google Cloud Platform using Cloud Run for the Resonate server and Cloud Functions for the research function.
Install the Resonate CLI with Homebrew or download the latest release from GitHub.
brew install resonatehq/tap/resonate
To run this project you need an OpenAI API key.
Ensure you have a Google Cloud Platform account, a project, and the necessary permissions as well as the Google Cloud CLI installed and configured.
Warning
Google Cloud Platform offers extensive configuration options. The instructions in this guide provide a baseline setup that you will need to adapt for your specific requirements, organizational policies, or security constraints.
Deploy the Resonate Server with its initial configuration.
Step 1: Initial deployment
gcloud run deploy resonate-server \
--image=resonatehqio/resonate:latest \
--region=us-central1 \
--allow-unauthenticated \
--timeout=300 \
--port=8080 \
--min-instances=1 \
--max-instances=1 \
--args="serve,--api-http-addr=:8080"
Step 2: Configuration
Configure the Resonate Server with its URL. The server's public URL is only known after the first deployment, so update the service to pass the URL back in via the --system-url flag.
export RESONATE_URL=$(gcloud run services describe resonate-server --region=us-central1 --format='value(status.url)')
gcloud run services update resonate-server \
--region=us-central1 \
--args="serve,--api-http-addr=:8080,--system-url=$RESONATE_URL"
Print the Resonate Server URL
echo $RESONATE_URL
To view the Resonate Server logs
gcloud run services logs read resonate-server --region=us-central1 --limit=50
Deploy the research function to Cloud Functions, substituting your OpenAI API key for sk-... in the command below.
gcloud functions deploy research \
--gen2 \
--runtime=nodejs20 \
--region=us-central1 \
--source=. \
--entry-point=handler \
--trigger-http \
--allow-unauthenticated \
--set-env-vars OPENAI_API_KEY=sk-...
Get the function URL:
export FUNCTION_URL=$(gcloud functions describe research --gen2 --region=us-central1 --format='value(serviceConfig.uri)')
Print the Cloud Function URL
echo $FUNCTION_URL
To view the Cloud Function logs
gcloud functions logs read research --gen2 --region=us-central1 --limit=50
Start a research task
resonate invoke <promise-id> --func research --arg <topic> --arg <depth> --target $FUNCTION_URL --server $RESONATE_URL
Example
resonate invoke research.1 --func research --arg "What are distributed systems" --arg 1 --target $FUNCTION_URL --server $RESONATE_URL
Use the resonate tree command to visualize the research execution.
resonate tree research.1 --server $RESONATE_URL
Delete the deployed services:
# Delete the Cloud Function
gcloud functions delete research --gen2 --region=us-central1 --quiet
# Delete the Cloud Run service
gcloud run services delete resonate-server --region=us-central1 --quiet
The Deep Research Agent depends on OpenAI and the OpenAI TypeScript and JavaScript SDK. If you are having trouble, verify that your OpenAI credentials are configured correctly and the model is accessible by running the following command in the project's directory:
node --input-type=module -e "import OpenAI from 'openai'; const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY }); const res = await client.chat.completions.create({ model: 'gpt-5', messages: [{ role: 'user', content: 'knock knock' }] }); console.log(res.choices[0].message);"
If everything is configured correctly, you will see a response from OpenAI such as:
{ role: 'assistant', content: "Who's there?", refusal: null, annotations: [] }
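If the one-liner is awkward to quote in your shell, the same check can be saved to a file (the name check-openai.mjs is just an example) and run with node check-openai.mjs:

// check-openai.mjs: the same connectivity check as the one-liner above
import OpenAI from "openai";

const client = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });
const res = await client.chat.completions.create({
  model: "gpt-5",
  messages: [{ role: "user", content: "knock knock" }]
});
console.log(res.choices[0].message);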
If you are still having trouble, please open an issue.
