
Why browsers are bloated

Name: Anonymous 2014-07-27 0:20

https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/Scrollbar.cpp
https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/win/ScrollbarThemeWin.cpp
Let's reinvent the fucking scrollbar, which every goddamn platform with a UI already has, and make it behave subtly differently from the native one!

Right-click a native scrollbar in some other app:
- Scroll Here
- Top
- Bottom
- Page Up
- Page Down
- Scroll Up
- Scroll Down

Right-click a scrollbar in Chrome:
- Back
- Forward
- Reload
- Save As...
...

Right-click a scrollbar in Firefox and Opera:
Absolutely fucking nothing happens!

What the fuck!? How did these terminally retarded idiots get involved in creating one of the most important pieces of software to the average user?
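For reference, every item in that native menu maps straight onto the standard Win32 scroll-bar notification codes, so any window created with WS_VSCROLL gets this behaviour essentially for free. A minimal sketch of a window procedure handling them (illustrative only: window setup and painting are omitted, and reading "Scroll Here" as SB_THUMBPOSITION is my assumption about the native control):

#include <windows.h>

// Sketch: map the native scrollbar notifications onto a position update
// for a window created with WS_VSCROLL. Setup and painting omitted.
LRESULT CALLBACK WndProc(HWND hwnd, UINT msg, WPARAM wParam, LPARAM lParam)
{
    if (msg == WM_VSCROLL) {
        SCROLLINFO si = { sizeof(si), SIF_ALL };
        GetScrollInfo(hwnd, SB_VERT, &si);
        int pos = si.nPos;

        switch (LOWORD(wParam)) {
        case SB_TOP:           pos = si.nMin;        break; // "Top"
        case SB_BOTTOM:        pos = si.nMax;        break; // "Bottom"
        case SB_LINEUP:        pos -= 1;             break; // "Scroll Up"
        case SB_LINEDOWN:      pos += 1;             break; // "Scroll Down"
        case SB_PAGEUP:        pos -= (int)si.nPage; break; // "Page Up"
        case SB_PAGEDOWN:      pos += (int)si.nPage; break; // "Page Down"
        case SB_THUMBTRACK:
        case SB_THUMBPOSITION: pos = si.nTrackPos;   break; // "Scroll Here" / thumb drag
        }

        si.fMask = SIF_POS;
        si.nPos = pos;
        SetScrollInfo(hwnd, SB_VERT, &si, TRUE); // clamps pos to the valid range
        return 0;
    }
    return DefWindowProc(hwnd, msg, wParam, lParam);
}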

Name: Anonymous 2014-11-17 23:41

>I just want to see the faces on those mainstream browser developers when a pure and simple GDI-based solution blows away their bloated renderer that needs a several-times-faster GPU...
IIRC GDI is partly hardware accelerated (poorly, since it was never meant to be used that way) since Vista, so I can't see a GDI renderer ever beating a renderer that uses GL/D3D directly.

>For example, compare how to do something that should be simple in OpenGL, like rendering some vector graphics
I thought you weren't going to implement the fancy shit like vector graphics and SVG?
The fact that you think bézier curves would be "the one thing that should be simple to implement in OpenGL" shows that you don't have any understanding of the graphics pipeline. GPUs are not magic graphics pixie dust, they're machines that eat vertices and shit pixels.
You can render them in multiple ways with OpenGL; on newer cards, tessellation shaders (and to a lesser extent, geometry shaders) are basically designed for turning bézier surfaces, the 3D equivalent of bézier curves, into triangles. So you could easily do the same thing in 2D. On older cards, you could do the curve->lines->triangles conversion yourself in software, or you could just put all of it on the slow path and render it all in software, like most people do for text. Point is, it's a choice that your application has to make and not something that belongs in DA STANDARD in any way.
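The curve->lines step is nothing exotic, for what it's worth; here's a minimal sketch (the fixed segment count is an arbitrary choice for illustration; a real renderer would derive it from the curve's size on screen) of flattening a cubic bézier into a polyline you could then hand to GL_LINE_STRIP or a triangulator:

#include <vector>

struct Vec2 { float x, y; };

static Vec2 lerp(Vec2 a, Vec2 b, float t)
{
    return { a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t };
}

// Evaluate the cubic at parameter t with de Casteljau's algorithm.
static Vec2 cubicBezier(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, float t)
{
    Vec2 a = lerp(p0, p1, t), b = lerp(p1, p2, t), c = lerp(p2, p3, t);
    Vec2 d = lerp(a, b, t), e = lerp(b, c, t);
    return lerp(d, e, t);
}

// Produce segments+1 points approximating the curve.
std::vector<Vec2> flattenCubic(Vec2 p0, Vec2 p1, Vec2 p2, Vec2 p3, int segments)
{
    std::vector<Vec2> pts;
    pts.reserve(segments + 1);
    for (int i = 0; i <= segments; ++i)
        pts.push_back(cubicBezier(p0, p1, p2, p3, (float)i / segments));
    return pts;
}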

And arbitrary path rendering and filling only gets more complicated from there. It's just not a strong suit for GPUs at all. https://en.wikipedia.org/wiki/Direct2D#Implementation

However, if you ignore the arbitrary vector SVG shit like I thought you would want to, then 99.99% of graphics rendering is going to be basic shit like rectangles, lines, circles, and scaled images, which any GPU can chew through with ease.
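And that basic stuff is exactly what plain GDI already covers. A minimal sketch of a paint handler drawing a rectangle, a line, a circle, and a 2x-scaled bitmap (hBitmap is assumed to have been loaded elsewhere, e.g. with LoadImage; error handling omitted):

#include <windows.h>

void PaintBasics(HWND hwnd, HBITMAP hBitmap)
{
    PAINTSTRUCT ps;
    HDC hdc = BeginPaint(hwnd, &ps);

    Rectangle(hdc, 10, 10, 110, 60);   // rectangle

    MoveToEx(hdc, 10, 80, NULL);       // line
    LineTo(hdc, 110, 80);

    Ellipse(hdc, 10, 100, 60, 150);    // circle (square bounding box)

    // Scaled image: blit the bitmap through a memory DC, stretched 2x.
    BITMAP bm;
    GetObject(hBitmap, sizeof(bm), &bm);
    HDC memDC = CreateCompatibleDC(hdc);
    HGDIOBJ old = SelectObject(memDC, hBitmap);
    StretchBlt(hdc, 130, 10, bm.bmWidth * 2, bm.bmHeight * 2,
               memDC, 0, 0, bm.bmWidth, bm.bmHeight, SRCCOPY);
    SelectObject(memDC, old);
    DeleteDC(memDC);

    EndPaint(hwnd, &ps);
}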

>As for porting, not so concerned - libXt/Xaw or whatever it's called on *nix should provide equivalent functionality on any system with X11.
At least with GDI you have stability on your side, but it's pretty likely that the X11 libs are going to die in a few years when the Wayland/Mir switch happens.

>but on Windows it's far more bloated because it has to drag along its own *nix GUI libraries and doesn't even feel like a native application.
There's a difference between not using a native UI for the "chrome" and not wanting the guts of your implementation tied to one platform.
