
Conversation


@rgarcia rgarcia commented Jan 13, 2026

Summary

Proxy additional Chrome DevTools Protocol JSON endpoints to the upstream Chromium instance:

  • /json/list - List all browser targets (pages, extensions, service workers)
  • /json/new - Create a new browser target (tab)

Why

This enables clients to query browser state via CDP without direct access to the internal Chromium port. Useful for:

  • Verifying extensions are loaded correctly
  • Inspecting open pages/tabs
  • Creating new tabs programmatically

Changes

  • Add cdpProxyHandler helper function that proxies requests to upstream Chromium
  • Register /json/list and /json/new routes
  • Include 10-second timeout on upstream requests to prevent hangs

Test Plan

  • Code compiles
  • Verify /json/list returns target list
  • Verify /json/new?url=... creates new tab

Note

Proxies additional Chrome DevTools Protocol JSON endpoints through the existing devtools server.

  • Add cdpProxyHandler that forwards requests to the upstream Chromium HTTP endpoint with a 10s timeout, preserving query params, status code, and response body
  • Register GET /json/list and GET /json/new on rDevtools to expose target listing and new-tab creation via the proxy

Written by Cursor Bugbot for commit 66572c0.


@tembo tembo bot left a comment


Looks good overall — simple proxy that keeps the surface area tight by hardcoding the upstream paths.

A few small robustness tweaks I left inline:

  • Use a context-aware upstream request (NewRequestWithContext + client.Do) so cancels propagate and you have an easy hook for headers.
  • Preserve upstream Content-Type when present, and log io.Copy errors (these happen in practice on disconnects).
  • Log the url.Parse failure so diagnosing bad upstream state isn’t just a generic 500.

```go
	Path:     path,
	RawQuery: r.URL.RawQuery,
}
resp, err := client.Get(httpURL.String())
```

Minor: client.Get won’t tie the upstream request to r.Context() (so it won’t cancel if the client disconnects / request times out earlier). Using a context-aware request also makes it easier to add headers later if needed.

Suggested change:

```diff
-resp, err := client.Get(httpURL.String())
+req, err := http.NewRequestWithContext(r.Context(), http.MethodGet, httpURL.String(), nil)
+if err != nil {
+	slogger.Error("failed to build upstream request", "path", path, "err", err)
+	http.Error(w, "failed to build upstream request", http.StatusInternalServerError)
+	return
+}
+resp, err := client.Do(req)
```

Comment on lines +203 to +205:

```go
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(resp.StatusCode)
io.Copy(w, resp.Body)
```

Two small things here: (1) consider preserving the upstream Content-Type instead of forcing JSON, and (2) io.Copy can fail (client hangup / upstream read error) so it’s worth at least logging the error.

Suggested change:

```diff
-w.Header().Set("Content-Type", "application/json")
-w.WriteHeader(resp.StatusCode)
-io.Copy(w, resp.Body)
+if ct := resp.Header.Get("Content-Type"); ct != "" {
+	w.Header().Set("Content-Type", ct)
+} else {
+	w.Header().Set("Content-Type", "application/json")
+}
+w.WriteHeader(resp.StatusCode)
+if _, err := io.Copy(w, resp.Body); err != nil {
+	slogger.Error("failed to proxy CDP response body", "path", path, "err", err)
+}
```

Comment on lines +186 to +189:

```go
if err != nil {
	http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
	return
}
```

Might be useful to log the parse error here (right now the client gets a 500 but you lose the underlying url.Parse context in logs).

Suggested change:

```diff
 if err != nil {
+	slogger.Error("failed to parse upstream URL", "err", err)
 	http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
 	return
 }
```

```go
defer resp.Body.Close()
w.Header().Set("Content-Type", "application/json")
w.WriteHeader(resp.StatusCode)
io.Copy(w, resp.Body)
```

Proxied CDP responses expose unusable internal WebSocket URLs

Medium Severity

The /json/list and /json/new endpoints proxy raw upstream responses containing webSocketDebuggerUrl fields that point to internal Chromium addresses (e.g., ws://localhost:9223/...). External clients cannot reach these internal URLs. This is inconsistent with /json/version, which correctly rewrites the URL to use r.Host so clients connect back through the proxy. Additionally, the WebSocket proxy ignores the request path and always connects to the browser target, so even rewritten page-specific URLs wouldn't work as expected.


@rgarcia rgarcia closed this Jan 13, 2026