Add /json/list and /json/new CDP proxy endpoints #116
Conversation
Proxy these Chrome DevTools Protocol JSON endpoints to the upstream Chromium instance:
- /json/list - List all browser targets (pages, extensions, workers)
- /json/new - Create a new browser target

This enables clients to query browser state via CDP without direct access to the internal Chromium port. Useful for verifying extensions are loaded, inspecting open pages, etc. Includes a 10-second timeout on upstream requests to prevent hangs.
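For orientation, here is a minimal sketch of what a proxy handler along these lines could look like, with the reviewer's suggestions below already folded in. The cdpProxyHandler name comes from the PR summary, but the signature, the upstreamBase and logger parameters, and the addresses are illustrative assumptions, not the repository's actual code.

```go
// Sketch only: a hedged approximation of the proxy handler this PR describes.
// upstreamBase, the logger, and the addresses are assumptions for illustration.
package main

import (
	"io"
	"log/slog"
	"net/http"
	"net/url"
	"time"
)

func cdpProxyHandler(upstreamBase, path string, logger *slog.Logger) http.HandlerFunc {
	// 10-second timeout on upstream requests, per the PR description.
	client := &http.Client{Timeout: 10 * time.Second}
	return func(w http.ResponseWriter, r *http.Request) {
		u, err := url.Parse(upstreamBase)
		if err != nil {
			logger.Error("failed to parse upstream URL", "err", err)
			http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
			return
		}
		u.Path = path
		u.RawQuery = r.URL.RawQuery // preserve query params, e.g. /json/new?url=...

		// Context-aware request so client disconnects cancel the upstream call.
		req, err := http.NewRequestWithContext(r.Context(), http.MethodGet, u.String(), nil)
		if err != nil {
			logger.Error("failed to build upstream request", "path", path, "err", err)
			http.Error(w, "failed to build upstream request", http.StatusInternalServerError)
			return
		}
		resp, err := client.Do(req)
		if err != nil {
			logger.Error("upstream CDP request failed", "path", path, "err", err)
			http.Error(w, "upstream request failed", http.StatusBadGateway)
			return
		}
		defer resp.Body.Close()

		// Preserve the upstream Content-Type when present; fall back to JSON.
		if ct := resp.Header.Get("Content-Type"); ct != "" {
			w.Header().Set("Content-Type", ct)
		} else {
			w.Header().Set("Content-Type", "application/json")
		}
		w.WriteHeader(resp.StatusCode)
		if _, err := io.Copy(w, resp.Body); err != nil {
			logger.Error("failed to proxy CDP response body", "path", path, "err", err)
		}
	}
}

func main() {
	logger := slog.Default()
	mux := http.NewServeMux()
	// Assumed upstream Chromium address and proxy listen address.
	mux.Handle("/json/list", cdpProxyHandler("http://localhost:9223", "/json/list", logger))
	mux.Handle("/json/new", cdpProxyHandler("http://localhost:9223", "/json/new", logger))
	http.ListenAndServe(":8080", mux)
}
```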
Looks good overall: a simple proxy that keeps the surface area tight by hardcoding the upstream paths.
A couple of small robustness tweaks I left inline:
- Use a context-aware upstream request (NewRequestWithContext + client.Do) so cancels propagate and you have an easy hook for headers.
- Preserve the upstream Content-Type when present, and log io.Copy errors (these happen in practice on disconnects).
- Log the url.Parse failure so diagnosing bad upstream state isn't just a generic 500.
```go
		Path:     path,
		RawQuery: r.URL.RawQuery,
	}
	resp, err := client.Get(httpURL.String())
```
Minor: client.Get won’t tie the upstream request to r.Context() (so it won’t cancel if the client disconnects / request times out earlier). Using a context-aware request also makes it easier to add headers later if needed.
Suggested change:
```diff
-	resp, err := client.Get(httpURL.String())
+	req, err := http.NewRequestWithContext(r.Context(), http.MethodGet, httpURL.String(), nil)
+	if err != nil {
+		slogger.Error("failed to build upstream request", "path", path, "err", err)
+		http.Error(w, "failed to build upstream request", http.StatusInternalServerError)
+		return
+	}
+	resp, err := client.Do(req)
```
```go
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(resp.StatusCode)
	io.Copy(w, resp.Body)
```
Two small things here: (1) consider preserving the upstream Content-Type instead of forcing JSON, and (2) io.Copy can fail (client hangup / upstream read error) so it’s worth at least logging the error.
Suggested change:
```diff
-	w.Header().Set("Content-Type", "application/json")
-	w.WriteHeader(resp.StatusCode)
-	io.Copy(w, resp.Body)
+	if ct := resp.Header.Get("Content-Type"); ct != "" {
+		w.Header().Set("Content-Type", ct)
+	} else {
+		w.Header().Set("Content-Type", "application/json")
+	}
+	w.WriteHeader(resp.StatusCode)
+	if _, err := io.Copy(w, resp.Body); err != nil {
+		slogger.Error("failed to proxy CDP response body", "path", path, "err", err)
+	}
```
```go
	if err != nil {
		http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
		return
	}
```
Might be useful to log the parse error here (right now the client gets a 500 but you lose the underlying url.Parse context in logs).
Suggested change:
```diff
-	if err != nil {
-		http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
-		return
-	}
+	if err != nil {
+		slogger.Error("failed to parse upstream URL", "err", err)
+		http.Error(w, "invalid upstream URL", http.StatusInternalServerError)
+		return
+	}
```
```go
	defer resp.Body.Close()
	w.Header().Set("Content-Type", "application/json")
	w.WriteHeader(resp.StatusCode)
	io.Copy(w, resp.Body)
```
Proxied CDP responses expose unusable internal WebSocket URLs
Medium Severity
The /json/list and /json/new endpoints proxy raw upstream responses containing webSocketDebuggerUrl fields that point to internal Chromium addresses (e.g., ws://localhost:9223/...). External clients cannot reach these internal URLs. This is inconsistent with /json/version, which correctly rewrites the URL to use r.Host so clients connect back through the proxy. Additionally, the WebSocket proxy ignores the request path and always connects to the browser target, so even rewritten page-specific URLs wouldn't work as expected.
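For reference, a hedged sketch of the kind of host rewrite /json/version already performs, applied to a /json/list body. The struct mirrors a subset of the public CDP JSON fields; the helper name and package are illustrative, not the repository's code.

```go
// Sketch only: rewrite webSocketDebuggerUrl values in a /json/list response so
// they point at the proxy host (typically r.Host) instead of the internal
// Chromium address.
package devtoolsproxy

import (
	"encoding/json"
	"net/url"
)

// cdpTarget mirrors a subset of the fields returned by the CDP /json/list endpoint.
type cdpTarget struct {
	ID                   string `json:"id"`
	Title                string `json:"title"`
	Type                 string `json:"type"`
	URL                  string `json:"url"`
	WebSocketDebuggerURL string `json:"webSocketDebuggerUrl"`
}

// rewriteTargetList parses the upstream /json/list body and points each
// webSocketDebuggerUrl at proxyHost.
func rewriteTargetList(body []byte, proxyHost string) ([]byte, error) {
	var targets []cdpTarget
	if err := json.Unmarshal(body, &targets); err != nil {
		return nil, err
	}
	for i := range targets {
		u, err := url.Parse(targets[i].WebSocketDebuggerURL)
		if err != nil || u.Host == "" {
			continue // leave missing or unparsable URLs untouched
		}
		u.Host = proxyHost
		targets[i].WebSocketDebuggerURL = u.String()
	}
	return json.Marshal(targets)
}
```

As the report notes, rewriting the host alone would not be sufficient here, since the WebSocket proxy currently ignores the request path and always attaches to the browser target.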
Summary
Proxy additional Chrome DevTools Protocol JSON endpoints to the upstream Chromium instance:
- /json/list - List all browser targets (pages, extensions, service workers)
- /json/new - Create a new browser target (tab)
Why
This enables clients to query browser state via CDP without direct access to the internal Chromium port. Useful for:
- Verifying extensions are loaded
- Inspecting open pages
Changes
- cdpProxyHandler helper function that proxies requests to upstream Chromium
- /json/list and /json/new routes
Test Plan
- /json/list returns target list
- /json/new?url=... creates new tab (see the verification sketch below)
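A minimal sketch of how the test plan above might be exercised by hand, assuming the proxy listens on localhost:8080; the address and target URL are illustrative, not from the PR.

```go
// Sketch only: quick manual check of the new endpoints through the proxy.
package main

import (
	"fmt"
	"io"
	"net/http"
	"net/url"
)

func main() {
	base := "http://localhost:8080" // assumed proxy address

	// /json/list should return the current browser targets as JSON.
	resp, err := http.Get(base + "/json/list")
	if err != nil {
		panic(err)
	}
	body, _ := io.ReadAll(resp.Body)
	resp.Body.Close()
	fmt.Println("targets:", string(body))

	// /json/new?url=... should create a new tab and return its target info.
	resp, err = http.Get(base + "/json/new?url=" + url.QueryEscape("https://example.com"))
	if err != nil {
		panic(err)
	}
	body, _ = io.ReadAll(resp.Body)
	resp.Body.Close()
	fmt.Println("new target:", string(body))
}
```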
Note
Proxies additional Chrome DevTools Protocol JSON endpoints through the existing devtools server.
- cdpProxyHandler that forwards requests to the upstream Chromium HTTP endpoint with a 10s timeout, preserving query params, status code, and response body
- GET /json/list and GET /json/new on rDevtools to expose target listing and new-tab creation via the proxy

Written by Cursor Bugbot for commit 66572c0.