
Conversation

@KapiX (Member) commented Dec 16, 2025

  • fillRect needs to draw a bitmap instead of using FillRect, because the latter is not affected by blending modes.
  • Removed PushState and PopState calls from most places, because they also reset the transform (see the sketch below).
  • Regular shadows work, inset shadows do not, and fillPath is still WIP.
  • This isn't ready for production because most sites still render half-transparent rectangles in place of shadows.

This is still WIP, but review is welcome. :)
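
To illustrate the PushState/PopState point above: the replacement is to save and restore by hand the individual pieces of BView state a drawing call touches, so the transform set with SetTransform() is left alone. A minimal sketch, not the actual patch (the helper name and the choice of saved state are illustrative):

```cpp
#include <View.h>

// Sketch only: restore the touched state by hand instead of calling
// PushState()/PopState(), which (as noted above) would also reset the
// transform applied with SetTransform().
static void
fillWithTemporaryColor(BView* view, BRect rect, rgb_color color)
{
	rgb_color savedColor = view->HighColor();
	drawing_mode savedMode = view->DrawingMode();

	view->SetHighColor(color);
	view->SetDrawingMode(B_OP_ALPHA);
	view->FillRect(rect);

	// Manual restore keeps the current affine transform intact.
	view->SetHighColor(savedColor);
	view->SetDrawingMode(savedMode);
}
```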

@pulkomandy (Member) left a comment


Thanks for working on this!

I have two comments about possible performance concerns, but that may not be so important. Let's get things working right, and then we can figure out how to optimize it and maybe move some processing to app_server.

Did you try to run the WebKit test suite on this? I have not tried it in a while, so it's probably broken in many interesting ways. I'll see if I can get it fixed a bit, as it may be useful to gauge the progress on this.

I'm afraid I don't really have any helpful comments on how this is all supposed to work. There is no documentation and the only reference is the other implementations. Maybe the WebKit Slack chat can be helpful if you find the right person in the right timezone to answer you.

contextShadow.drawRectShadow(*this, FloatRoundedRect(rect));
HGTRACE(("hasDropShadow end\n"));
}
// FillRect doesn't respect blending modes, DrawBitmap does
Member


That seems like a bug in FillRect; could it be fixed there?

I'm worried that this solution is inefficient compared to just sending a FillRect command to app_server.

@KapiX (Member Author) commented Dec 16, 2025

I don't think so; the BeBook says:

B_ALPHA_COMPOSITE

Used when blending two or more transparent images together offscreen, to produce a new transparent image that will later be drawn onscreen using the B_ALPHA_OVERLAY setting.

I tried to make it as efficient as possible: a small bitmap is used, and the tiling happens on the app_server side.

EDIT: It could perhaps be enabled only when generating the shadow, but I figured DrawTiledBitmap is not so bad performance-wise.
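
Concretely, the fill path under discussion boils down to something like the sketch below (not the actual diff; the helper name and the 8x8 tile size are only illustrative). The point is that the bitmap stays tiny and app_server does the tiling, while the blending mode is honoured:

```cpp
#include <Bitmap.h>
#include <View.h>

// Sketch only: fill a rectangle by tiling a small solid-colour bitmap, so
// the configured drawing/blending mode is applied (FillRect would ignore it).
static void
fillRectWithTile(BView* target, BRect rect, rgb_color fillColor)
{
	// A tiny B_RGBA32 tile holding the fill colour; its size is independent
	// of the rectangle being filled, since app_server repeats it.
	BBitmap tile(BRect(0, 0, 7, 7), B_RGBA32);
	uint8* bits = static_cast<uint8*>(tile.Bits());
	for (int32 i = 0; i < tile.BitsLength(); i += 4) {
		bits[i + 0] = fillColor.blue;	// B_RGBA32 is stored as B, G, R, A
		bits[i + 1] = fillColor.green;
		bits[i + 2] = fillColor.red;
		bits[i + 3] = fillColor.alpha;
	}

	// B_ALPHA_COMPOSITE here matches the offscreen-composition case quoted
	// from the BeBook above.
	target->SetDrawingMode(B_OP_ALPHA);
	target->SetBlendingMode(B_PIXEL_ALPHA, B_ALPHA_COMPOSITE);
	target->DrawTiledBitmap(&tile, rect);
}
```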

BRect pathBounds = BRect(0, 0, rect.width() + 1, rect.height() + 1);
BBitmap *pathBitmap = new BBitmap(pathBounds, B_RGBA32, true, false);
if(m_painter == nullptr) {
m_painter = new BView(pathBounds, "painter", B_FOLLOW_ALL, B_WILL_DRAW);
Member


Creating a bitmap and a view is a costly operation (it spawns a thread in app_server, at least). Maybe we want to put some caching in place to recycle the bitmap across multiple drawing operations, if that is needed.

Member Author


The view is cached, the bitmap is not. I think we should make a bitmap pool though (with reference counting and freeing after a timeout); then Async could be used again (holding the bitmap only a little longer, until the frame is rendered).
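
A pool along those lines might look roughly like this. It is only a sketch of the idea; BitmapPool, Entry and the five-second timeout are made-up names and values, not anything in the patch:

```cpp
#include <Bitmap.h>
#include <OS.h>

#include <map>

// Sketch of a reference-counted bitmap pool that frees unused bitmaps after
// a timeout, as suggested above.
class BitmapPool {
public:
	// Hand out a pooled bitmap at least as large as `bounds`, or create one.
	BBitmap* Acquire(BRect bounds)
	{
		for (auto& [bitmap, entry] : fEntries) {
			if (entry.refCount == 0 && bitmap->Bounds().Contains(bounds)) {
				entry.refCount++;
				return bitmap;
			}
		}
		BBitmap* bitmap = new BBitmap(bounds, B_RGBA32, true /* accepts views */);
		fEntries[bitmap] = { 1, system_time() };
		return bitmap;
	}

	// Drop a reference; the bitmap stays pooled until CollectGarbage() expires it.
	void Release(BBitmap* bitmap)
	{
		auto it = fEntries.find(bitmap);
		if (it == fEntries.end())
			return;
		it->second.refCount--;
		it->second.lastUsed = system_time();
	}

	// Delete bitmaps that have been unreferenced for longer than kTimeout.
	void CollectGarbage()
	{
		static const bigtime_t kTimeout = 5 * 1000000LL; // 5 seconds
		bigtime_t now = system_time();
		for (auto it = fEntries.begin(); it != fEntries.end();) {
			if (it->second.refCount == 0 && now - it->second.lastUsed > kTimeout) {
				delete it->first;
				it = fEntries.erase(it);
			} else
				++it;
		}
	}

private:
	struct Entry {
		int32		refCount;
		bigtime_t	lastUsed;
	};

	std::map<BBitmap*, Entry> fEntries;
};
```

Keeping the bitmap referenced in the pool until the frame is finished would then make it safe to go back to the Async drawing calls, as mentioned.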

@KapiX (Member Author) commented Dec 16, 2025

> Thanks for working on this!
>
> I have two comments about possible performance concerns, but that may not be so important. Let's get things working right, and then we can figure out how to optimize it and maybe move some processing to app_server.

What would help is app_server's transparency layers supporting a composition mode as a parameter.
Currently they hardcode B_ALPHA_COMPOSITE, but since more modes are available now, this could easily be changed.
GraphicsContext even has a beginTransparencyLayer overload that takes a CompositeOperator parameter.
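
For reference, the mapping such an overload would need is fairly mechanical. The sketch below assumes WebCore's CompositeOperator enum and the Porter-Duff B_ALPHA_COMPOSITE_* constants from newer GraphicsDefs.h; the helper name is made up and the list is not exhaustive:

```cpp
#include <GraphicsDefs.h>

#include "GraphicsTypes.h" // WebCore, for CompositeOperator

// Illustrative only: translate WebCore composition operators into Haiku
// alpha_function values for SetBlendingMode().
static alpha_function
toHaikuAlphaFunction(WebCore::CompositeOperator op)
{
	switch (op) {
		case WebCore::CompositeOperator::SourceOver:
			return B_ALPHA_COMPOSITE_SOURCE_OVER;
		case WebCore::CompositeOperator::SourceIn:
			return B_ALPHA_COMPOSITE_SOURCE_IN;
		case WebCore::CompositeOperator::SourceOut:
			return B_ALPHA_COMPOSITE_SOURCE_OUT;
		case WebCore::CompositeOperator::SourceAtop:
			return B_ALPHA_COMPOSITE_SOURCE_ATOP;
		case WebCore::CompositeOperator::DestinationOver:
			return B_ALPHA_COMPOSITE_DESTINATION_OVER;
		case WebCore::CompositeOperator::XOR:
			return B_ALPHA_COMPOSITE_XOR;
		default:
			return B_ALPHA_COMPOSITE; // today's hardcoded behaviour
	}
}
```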

> Did you try to run the WebKit test suite on this? I have not tried it in a while, so it's probably broken in many interesting ways. I'll see if I can get it fixed a bit, as it may be useful to gauge the progress on this.

Not on this change, but I did try running it. It went through 10,000 tests before app_server (?) crashed :)
I will try to make it work too after I'm done with this; for now I'm fine working with custom HTML files for testing :)

> I'm afraid I don't really have any helpful comments on how this is all supposed to work. There is no documentation and the only reference is the other implementations. Maybe the WebKit Slack chat can be helpful if you find the right person in the right timezone to answer you.

I actually hoped you could tell me more about how HaikuGraphicsContext is wired, especially the main viewport. I seem to remember there was one big offscreen bitmap, but maybe that's no longer the case?

@pulkomandy (Member)

> Not on this change, but I did try running it. It went through 10,000 tests before app_server (?) crashed :)

I think one (or several) of the tests creates too many bitmaps with views, and the system runs out of ports. Running fewer tests in parallel may help.

> I actually hoped you could tell me more about how HaikuGraphicsContext is wired, especially the main viewport. I seem to remember there was one big offscreen bitmap, but maybe that's no longer the case?

It's still the case, but that is done in WebKitLegacy, not in GraphicsContext itself: https://github.com/haiku/haikuwebkit/blob/haiku/Source/WebKitLegacy/haiku/API/WebView.cpp#L74

For WebKit2 the setup is a bit different: the bitmap is created in the WebProcess, rendered by GraphicsContext, and then sent to the web browser process where it is displayed. That is currently only in the haiku-webkit2 branch where there is a SharedBitmapHaiku implementation to handle it.

In theory there's nothing preventing GraphicsContext from being used directly to render on-screen, but that would only work in WebKitLegacy and would probably show a lot of flickering while the rendering is being constructed.
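
For anyone tracing that path, the WebKitLegacy setup is essentially the classic Haiku offscreen-bitmap pattern. A simplified sketch (the real code is in the WebView.cpp linked above; the function names here are only illustrative):

```cpp
#include <Bitmap.h>
#include <View.h>

// Sketch only: one big offscreen bitmap that accepts views. WebCore's
// GraphicsContext draws into the attached BView, and the finished frame is
// blitted to the visible view.
static BBitmap*
createOffscreenTarget(BRect frame)
{
	BBitmap* offscreen = new BBitmap(frame, B_RGBA32, true /* accepts views */);
	BView* drawingView = new BView(frame, "drawing", B_FOLLOW_NONE, 0);
	offscreen->AddChild(drawingView);
	return offscreen;
}

static void
paintFrame(BBitmap* offscreen, BView* onScreenView)
{
	BView* drawingView = offscreen->ChildAt(0);

	offscreen->Lock();
	// ... the Haiku GraphicsContext issues its drawing calls on drawingView here ...
	drawingView->Sync();
	offscreen->Unlock();

	// Copy the result to the on-screen view (e.g. from its Draw() hook).
	onScreenView->DrawBitmap(offscreen, offscreen->Bounds(), onScreenView->Bounds());
}
```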
