
Why browsers are bloated

Name: Anonymous 2014-07-27 0:20

https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/Scrollbar.cpp
https://github.com/WebKit/webkit/blob/master/Source/WebCore/platform/win/ScrollbarThemeWin.cpp
Let's reinvent the fucking scrollbar, which every goddamn platform with a UI already has, and make it behave subtly different from the native one!

Right-click a native scrollbar in some other app:
- Scroll Here
- Top
- Bottom
- Page Up
- Page Down
- Scroll Up
- Scroll Down

Right-click a scrollbar in Chrome:
- Back
- Forward
- Reload
- Save As...
...

Right-click a scrollbar in Firefox and Opera:
Absolutely fucking nothing happens!

What the fuck!? How did these terminally retarded idiots get involved in creating one of the most important pieces of software to the average user?

Name: Anonymous 2014-11-17 19:45

I think some of your design goals are at odds with each other, Cudder. GDI is ancient shit and it's only going to make your browser harder to port. A D3D or OpenGL renderer would draw significantly faster and allow you to easily render most of the page at once, significantly improving scroll performance (just a blit).
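
To spell out the "just a blit" part, here is a rough sketch (assuming the page has already been rendered into an offscreen framebuffer and you have a GL 3.0+ context with a loader like glad or GLEW set up; present_scrolled and page_fbo are made-up names, not anything from an existing codebase):

    #include <GL/glew.h>

    /* Present the visible slice of a pre-rendered page by blitting it to the
       default framebuffer. Note that GL's framebuffer origin is bottom-left. */
    void present_scrolled(GLuint page_fbo, int page_h,
                          int view_w, int view_h, int scroll_y)
    {
        if (scroll_y + view_h > page_h)            /* clamp to the page */
            scroll_y = page_h - view_h;

        glBindFramebuffer(GL_READ_FRAMEBUFFER, page_fbo);
        glBindFramebuffer(GL_DRAW_FRAMEBUFFER, 0); /* the window */

        glBlitFramebuffer(0, scroll_y, view_w, scroll_y + view_h,
                          0, 0, view_w, view_h,
                          GL_COLOR_BUFFER_BIT, GL_NEAREST);
    }

Layout and rasterization only have to happen when the page changes; a scroll event is just another call to this.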

Leaning too heavily on Windows features in general is going to kill any chance of this making waves in the web "developer" community. They're all on OSX and Linux these days.

Name: Anonymous 2014-11-17 20:03

>>278
loop array_fold_loop
U MENA DEC RCX FOLLOWED BY JNZ array_fold_loop ???!??!?

Name: Anonymous 2014-11-17 20:26

>>281
GDI is ancient shit
Damn right. The Brotherhood of NOD is where it's at.

Name: Anonymous 2014-11-17 20:50

>>274
Do you like to eat shit, then? Because billions of flies can't be wrong.

Name: Cudder !MhMRSATORI 2014-11-17 21:49

>>280
Programmers don't give a shit about "bytes", "bits", "cycles" or any of that sewer infrastructure that has the honor of running programs.
Those are not true programmers, they're the head-in-the-sky academics who are so far from knowing what reality is that all they can do is come up with more resource-wasting "solutions" to non-problems.

Because wasting time on floor-scrubbery means having less time to actually write useful programs.
Say that when you figure out how much time you're wasting just waiting for your ultra-bloated system to do the stuff you want to do... now multiply that by everyone else using the same software... because the id10ts who wrote the software you use followed the same entitled "my time is so precious I'd rather waste YOUR time and money" mentality... fucking retard.

>>281
Ancient but simple and works. I don't need 3D capability anyway. Maybe D3D or OpenGL could be faster, but I don't care because I don't think it's needed and it's quite a step up in complexity. I just want to see the faces on those mainstream browser developers when a pure and simple GDI-based solution blows away their bloated renderer that needs a several-times-faster GPU...

For example, look at how to do something that should be simple, like rendering some vector graphics. In OpenGL you find http://gamedev.stackexchange.com/questions/48034/how-do-i-render-a-filled-and-stroked-path-using-opengl which basically says "it's too hard", while GDI can do it with ONE function: http://msdn.microsoft.com/en-us/library/windows/desktop/dd145123(v=vs.85).aspx
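
If anyone wants it spelled out, the GDI path calls look roughly like this (a sketch only; it assumes a valid HDC with whatever pen and brush you want already selected, draw_filled_curve is a made-up name, and I'm not claiming this is the exact function behind that MSDN link):

    #include <windows.h>

    /* Fill and stroke one closed path containing a cubic bézier. */
    void draw_filled_curve(HDC hdc)
    {
        POINT ctrl[3] = { { 60, 10 }, { 120, 90 }, { 180, 20 } };

        BeginPath(hdc);
        MoveToEx(hdc, 10, 50, NULL);
        PolyBezierTo(hdc, ctrl, 3);    /* point count must be a multiple of 3 */
        CloseFigure(hdc);
        EndPath(hdc);

        SetPolyFillMode(hdc, WINDING);
        StrokeAndFillPath(hdc);        /* outline with the pen, fill with the brush */
    }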

As for porting, not so concerned - libXt/Xaw or whatever it's called on *nix should provide equivalent functionality on any system with X11. I don't care how hard it is to port because chances are if I want it to be efficient on a specific platform I'm going to have to use platform-specific things and end up rewriting a lot of the code to take advantage of that. Look at something like Dillo for example - I haven't tested it much on *nix, but on Windows it's far more bloated because it has to drag along its own *nix GUI libraries and doesn't even feel like a native application. Forget about making a lowest-common-denominator monstrosity that performs half-decently on a ton of platforms, concentrate on one platform first and others afterward.

Name: Anonymous 2014-11-17 23:11

>>285
Why do you insist on academics being the ones who care about ``understandable programs''? They're exactly the ones who care about microoptimizations and make programs unreadable and full of abstract bullshite.

Are you sure you're not mixing them up with the ENTERPRISE crowd?

Name: Anonymous 2014-11-17 23:41

I just want to see the faces on those mainstream browser developers when a pure and simple GDI-based solution blows away their bloated renderer that needs a several-times-faster GPU...
IIRC GDI is partly hardware accelerated (poorly, since it was never meant to be used that way) since Vista, so I can't see a GDI renderer ever beating a renderer that uses GL/D3D directly.

For example, compare how to do something that should be simple in OpenGL, like rendering some vector graphics
I thought you weren't going to implement the fancy shit like vector graphics and SVG?
The fact that you think bézier curves would be "the one thing that should be simple to implement in OpenGL" shows that you don't have any understanding of the graphics pipeline. GPUs are not magic graphics pixie dust, they're machines that eat vertices and shit pixels.
You can render them in multiple ways with OpenGL; on newer cards, tessellation shaders (and to a lesser extent, geometry shaders) are basically designed for turning bézier surfaces, the 3D equivalent of bézier curves, into triangles. So you could easily do the same thing in 2D. On older cards, you could do the curve->lines->triangles conversion yourself in software, or you could just put all of it on the slow path and render it all in software, like most people do for text. Point is, it's a choice that your application has to make and not something that belongs in DA STANDAD in any way.
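
For the "do the conversion yourself in software" route, the curve-to-lines step is trivial; here is a sketch using uniform parameter steps (adaptive subdivision would look better, and flatten_cubic is just an illustrative name):

    typedef struct { float x, y; } Pt;

    /* Flatten one cubic bézier into steps+1 polyline points; the caller then
       turns the polyline into triangles (or hands it to any line renderer). */
    void flatten_cubic(Pt p0, Pt p1, Pt p2, Pt p3, int steps, Pt *out)
    {
        for (int i = 0; i <= steps; i++) {
            float t = (float)i / (float)steps;
            float u = 1.0f - t;
            float b0 = u * u * u;
            float b1 = 3.0f * u * u * t;
            float b2 = 3.0f * u * t * t;
            float b3 = t * t * t;
            out[i].x = b0 * p0.x + b1 * p1.x + b2 * p2.x + b3 * p3.x;
            out[i].y = b0 * p0.y + b1 * p1.y + b2 * p2.y + b3 * p3.y;
        }
    }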

And arbitrary path rendering and filling only gets more complicated from there. It's just not a strong suit for GPUs at all. https://en.wikipedia.org/wiki/Direct2D#Implementation

However, if you ignore the arbitrary vector SVG shit like I thought you would want to, then 99.99% of graphics rendering is going to be basic shit like rectangles, lines, circles, and scaled images, that any GPU can chew through with ease.

As for porting, not so concerned - libXt/Xaw or whatever it's called on *nix should provide equivalent functionality on any system with X11.
At least with GDI you have stability on your side, but it's pretty likely that the X11 libs are going to die in a few years when the Wayland/Mir switch happens.

but on Windows it's far more bloated because it has to drag along its own *nix GUI libraries and doesn't even feel like a native application.
There's a difference between not using a native UI for the "chrome" and not wanting the guts of your implementation tied to one platform.

Name: Anonymous 2014-11-18 0:47

>>287
The fact that you think bézier curves would be "the one thing that should be simple to implement in OpenGL" shows that you don't have any understanding of the graphics pipeline. GPUs are not magic graphics pixie dust, they're machines that eat vertices and shit pixels.

I hate to stick up for Cudder, but in the olden days we had some shitty yet remarkable support for analytic objects, which weren't described in verts. So it might just work.

However, if you ignore the arbitrary vector SVG shit like I thought you would want to, then 99.99% of graphics rendering is going to be basic shit like rectangles, lines, circles, and scaled images, that any GPU can chew through with ease.

Not so. Have you heard of floats and reflow? This shit is everywhere and it sucks. Once that's done, you sure can shit it out of a GPU pretty quick, but the GPU does the least of the work in the end, so who cares? See also: https://github.com/servo/servo

Don't get me wrong, I think Cudder is crazy and GDI is awful but somehow I think that's probably the point.

Name: Anonymous 2014-11-18 2:38

I hate to stick up for Cudder, but in the olden days we had some shitty yet remarkable support for analytic objects, which weren't described in verts. So it might just work.

I believe most early graphics hardware basically consisted of one or two fast serial DSPs, so it wouldn't have been that weird for them to use a "software" rasterization method like a scanline rasterizer for splines/bézier curves/etc; this is essentially how they implemented triangle rasterization or anything else as well.

You can do anything you want with compute shaders now, but 1. it's not going to take advantage of the fixed-function hardware like the triangle rasterizer, and 2. you're not going to get good parallelism unless you need to render hundreds of bézier curves for some reason; the only legitimate usage I could think of would be high-quality GPU-accelerated font rasterization, which isn't a terrible idea come to think of it. So again, given that there are so many different ways to implement these things with their own tradeoffs, it makes little sense to bake it into an API that is supposed to be a thin shell over the hardware. The fixed-function pipeline is dead.

Aside: lol, Firefox wants to correct "GPU-accelerated" to "GNU-accelerated"

Not so. Have you heard of floats and reflow? This shit is everywhere and it sucks. Once that's done, you sure can shit it out of a GPU pretty quick, but the GPU does the least of the work in the end, so who cares?

Because it can save CPU time that would be wasted on a task it isn't suited for while the GPU is mostly sitting idle, because doing very simple rendering on the GPU is going to use less power than doing the same task on the CPU, and because you have to go through the GPU at the end anyway so you might as well take advantage of it.

But hey, it won't run on grandthing Cudder's Win95 Pentium, so it's not acceptable.

Name: Anonymous 2014-11-18 3:39

I believe most early graphics hardware basically consisted of one or two fast serial DSPs,

That's not really the point. All cards can still do it (just not in "modern GL"), and the output quality is as good as the display can handle. If you're linking GL, you might as well use it.

GPU-accelerated font rasterization, which isn't a terrible idea come to think of it.

It's been done. It's neat, but it chews up your entire GPU with only a single column of text. The reflow on a modern web page (or desktop) is harsh.

Because it can save CPU time that would be wasted on a task it isn't suited for while the GPU is mostly sitting idle,

You do get a boost out of it, but it's tiny by comparison. For most of the page you have to construct the pixmaps on the CPU anyway. I don't think anyone would bother if they weren't already linking GL for use with WebGL. PS. I hate to break it to you, but scrolling isn't just blitting: a lot of sites modify the DOM in scroll handlers. Yes, it's stupid and I hate it, but that doesn't change it.

Name: Anonymous 2014-11-18 4:28

That's not really the point. All cards can still do it (just not in "modern GL"), and the output quality is as good as the display can handle. If you're linking GL, you might as well use it.

I can pretty much guarantee you that they're emulating it on the CPU or in a shader. It may be easy to use the fixed function pipeline, but everyone's abandoned it for a reason. You can almost always do things better yourself.
Not to mention, you don't get any fixed function support on GL ES devices, nor in the newer GL or D3D contexts.

It's been done.

I'm aware of a library that precalculates all the glyphs into a texture and renders like a bitmapped font, but that's not exactly the same.

It's neat, but it chews up your entire GPU with only a single column of text. The reflow on a modern web page (or desktop) is harsh.

Eh, I don't see why e.g. FreeType would be any faster at rendering the glyphs.

You do get a boost out of it, but it's tiny by comparison.

[citation needed]

PS. I hate to break it to you, but scrolling isn't just blitting, a lot of sites modify the DOM in scroll handlers.

But Cudder-sama is ignoring those parts of the spec.

Name: Anonymous 2014-11-18 5:54

>>291
You're still missing the point. "You can almost always do things better yourself." Analytic objects? Really? They look really good in GL, for what they are at least. You're going to spend a lot of time doing a bad job of it, that's what you're going to do.

I'm aware of a library that precalculates all the glyphs into a texture and renders like a bitmapped font, but that's not exactly the same.

And not what I'm talking about. If I thought it was, I would have written it off as "looks like shit" and "we stopped using bitmapped fonts for a reason" immediately.

Eh, I don't see why e.g. FreeType would be any faster at rendering the glyphs.

Because merely rendering glyphs doesn't render text.

I forget what the exact problem was, but it was pretty fundamental. I think it had something to do with how the amount of data required to represent the glyphs created a lot of tension and used up a lot of the GPU. Sure, it was faster, but so much less efficient that it isn't really worth it to actually do it (i.e. you do want to draw graphics with your card, right?). Keep in mind you have kerning, reflow and all that stuff to worry about, and yes, you need to do it on the GPU or else you will just waste time copying data back and forth.

Someday I hope it works out well. It's not going to happen on today's hardware though. Moving on.

[citation needed]

If you really think compositing some pixmaps is so hefty when compared to rendering them on a CPU, I would love to hear all about how that's supposed to work. If you think you can do the rendering efficiently on a GPU I encourage you to try. Many people will thank you if it works.

I'll see if I can find you a citation though.

But Cudder-sama is ignoring those parts of the spec.

Fine, but: in that case you can do the same thing on the CPU. I know you still think compositing is so expensive (even when we're not doing blur and other effects, are we?) so there's probably not much in this line of discussion.

Name: Anonymous 2014-11-18 8:41

You're still missing the point. "You can almost always do things better yourself." Analytic objects? Really? They look really good in GL, for what they are at least. You're going to spend a lot of time doing a bad job of it, that's what you're going to do.

Graphics programming, motherfucker. No one cares if your result looks the same, we want you to do it as fast as possible. If you don't want to put the effort in, get the fuck out and become an HTML apper.

Because merely rendering glyphs doesn't render text.
I forget what the exact problem was, but it was pretty fundamental. I think it had something to do with how the amount of data required to represent the glyphs created a lot of tension and used up a lot of the GPU. Sure, it was faster, but so much less efficient that it isn't really worth it to actually do it (i.e. you do want to draw graphics with your card, right?). Keep in mind you have kerning, reflow and all that stuff to worry about, and yes, you need to do it on the GPU or else you will just waste time copying data back and forth.

Text rendering, not layout, kerning, etc. You don't really seem to know what you're talking about, but for the record, I'm talking about RASTERIZING GLYPHS, COMPUTING THE PIXEL COVERAGE OF BÉZIER CURVES on the GPU, and not anything else completely orthogonal to that goal.

Good fonts have kerning information built in, so that you don't need to rasterize the text to tell how the characters should line up. You just plow through the character widths+kerning info on the CPU to produce a stream of characters and their positions, put that in the pipeline, and let a shader running on hundreds of processing elements in parallel compute thousands of perfectly antialiased glyphs without breaking a sweat.
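
The CPU side of that is nothing. A sketch (layout_run is a made-up name, and advance[] and kern[][] stand in for tables you'd pull out of the font, one entry per glyph and per pair):

    #include <stddef.h>

    typedef struct { float x, y; unsigned glyph; } GlyphPos;

    /* Walk the text once, applying advance widths and pair kerning, and emit
       one positioned glyph record per character for the GPU to rasterize. */
    void layout_run(const unsigned char *text, size_t len,
                    const float advance[256], const float kern[256][256],
                    float pen_x, float pen_y, GlyphPos *out)
    {
        unsigned prev = 0;
        for (size_t i = 0; i < len; i++) {
            unsigned c = text[i];
            if (i > 0)
                pen_x += kern[prev][c];   /* kerning for the (prev, c) pair */
            out[i].x = pen_x;
            out[i].y = pen_y;
            out[i].glyph = c;
            pen_x += advance[c];          /* horizontal advance of this glyph */
            prev = c;
        }
    }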

Name: Anonymous 2014-11-18 9:07

>>293
Dude, you're arguing against using a pixel perfect, debugged, optimized function of a library you're already linking in favour of rolling your own, for a task that's easy to fuck up.

Text rendering, not layout, kerning, etc.
It's not rendered text until you've done the rest. Sure, if you want to render glyphs, go ahead. They're perfectly usable individual glyphs that have no place being next to one another. If that's all you want, it's fine... and has nothing to do with anything.

Name: Anonymous 2014-11-19 7:08

this thread perfectly encapsulates all of our dysfunctional behaviors.

Name: Anonymous 2014-11-19 7:24

>>295
ENCAPSULATE MY ANUS

Name: Anonymous 2014-11-19 7:26

Analytic objects?

Pffsh.

Big deal.

Man, I'm hungry.

Maybe I can get some pussy outside.

(movement begins to occur in my pants) grumble grumble (something is happening!) ROARRRR ROARRRRR ROARRRRRRRRRRRRRRRR!

Settle! Down, boy!

Name: Anonymous 2014-11-19 7:34

>>297
The reactions of our generation:

Ben Bitdiddle: Who's gonna walk outside with an erection like that? hahaha

Alyssa P. Hacker: OMG! That poor kitten!

Lem E. Tweakit: That's not how you train a fucking dog, here, let me show you how! *Grabs the dog ...*

Name: Anonymous 2014-11-19 15:38

>>298
That's not how you fuck a training dog
FTFY

Name: Anonymous 2014-11-19 15:52

Alyssa is a dumb ugly slut, Eva is much smarter and cuter.

Name: Anonymous 2014-11-19 23:36

So, like, we have a Fossil repo now:
http://progrider.org/fossil/login?g=/fossil/home

Why don't we upload the first mock up of the browser?

Name: Anonymous 2014-11-20 4:51

>>301
The browser doesn't fucking belong to you. Write your own.

Name: Anonymous 2014-11-21 0:37

>>302
You're so cute when you're mad cudder :3

Name: Anonymous 2014-11-21 2:30

>>303
Really though, >>302-kun is right, >>301-chan was being very presumptuous. There is no we or our on this, it's Cudder's browser, and if I know Cudder (which I do), she is never going to release it. Just like that SoftICE killer, and that decompiler that she swears would destroy the industry if she released it. I'm sure her browser is very good and very fast (Cudder is actually very skilled), but it is going to stay on her hard drive until the end of time.

Anyway, we don't really have board projects. We all just do our own thing and come here to talk down to one another. If it were like it is at, say, /g/ or reddit, where everyone is presumed to be equals, there would be one person doing actual work, while a thousand others bitched and complained at him for not doing things the way they liked, or for not supporting their obscure bullshit platform. Either that, or there would be no work done at all (because they are talentless nitwits, but that's neither here nor there) and everyone would just argue about the name and logo.

Moving on to the next point in this frivolous and unnecessary post that will further clog up Cudder's vaporware thread, we probably should use the fossil repo more. Admin-kike was very nice to provide it. I tried for about five minutes to set it up to work through Tor, but got bored and started fapping to Suwako. I guess I can try again and put something in it.

Name: Anonymous 2014-11-21 2:38

>>304
I would put something up on it, but my projects will also rot on my various hard drives and usb sticks until they are destroyed.

Name: >>305 2014-11-21 2:41

However I would be open to sending copies of my code to some people here. And maybe even collaborating on something. But I would want the collaboration closed to just us. I don't want my gang stalkers to be able to find it, or infiltrate the group later by just clicking their way in.

Name: Anonymous 2014-11-21 10:10

>>306
Just pretend it isn't yours.
Put the project up there and continue updating (and accepting collaborators). Nobody needs to know the author.

Name: Anonymous 2014-11-21 11:17

>>307
I can pretend all I want that it isn't mine, but a part of my being will be in the project, and I'm not willing to share this with my gang stalkers.

Name: Anonymous 2014-11-21 16:16

>>308
Sounds like vaporware.
Will it ever be released as open-source?
I thought you were cool.

Name: Anonymous 2014-11-21 19:27

>>309
It puts me in a tough situation. I want all information to be free and I want to share my creations with the world. But there are also some terrible people in the world. For my own sake I want to deprive these people of knowledge about myself. So my contributions to the open net are limited.

Have you ever witnessed what happens when you are linked to your online life and you piss off some people? Cops going through your facebook posts and interpreting obvious satire as criminal extremism. Angry nerds finding your home address and ordering pizzas. Angry nerds finding out the route you use when you go for a run.

But regardless, you're a retard for judging the character of a person by whether or not they open source their software, RMS cultist.

Name: Anonymous 2014-11-21 21:42

I want all information to be free and I want to share my creations with the world
Nice

But there are also some terrible people in the world.
There are.

For my own sake I want to deprive these people of knowledge about myself
Do you consider a minimalist web browser to be "knowledge about yourself"?

So my contributions to the open net are limited.
What the hell?!

Have you ever witnessed what happens when you are linked to your online life and you piss off some people?
Yes, but what does that have to do with anything? Will your browser piss someone off? Will they link it to your real identity? Why the hell don't you just start contributing anonymously, without letting people know it's you?

Crazy bastard.

Name: Anonymous 2014-11-21 23:40

>>311
I'm not cudder, just so you know.

Do you consider [personal project] to be "knowledge about yourself"?
Yes. Any material about yourself online can be viewed and interpreted. A possible employer could look at my online work and come to some irrational conclusion and place judgement. Like if I do functional programming in my free time, then I must be one of those crazy Haskellers that will want to change the company's code to be pure. Or that I don't know how to program imperatively. These are trivial. At worst I become a target of investigation and police accuse my open source software of being ``illegal hacking tools''. Fuck them.

What the hell?!
Why do you care anyways? It's none of your business what I do with my free time.

Why the hell don't you just start contributing anonymously, without letting people know it's you?
If people want to find out who you are they'll analyze what you've posted. See the attempts on Satoshi (bitcoin). If I contribute to a project anonymously, I have to isolate all ideas and methods used in that project. If I accidentally reuse an identifying technique in another project directly associated with me, I'm linked to it. Coding conventions like spacing and variable names are also an issue, but easier to deal with.

Why the hell don't you just start contributing anonymously, without letting people know it's you?
Why would I? Give me a reason that makes it worth the hassle.

Name: Anonymous 2014-11-21 23:43

Coding conventions like spacing and variable names are also an issue, but easier to deal with.
Do it GNU style and you'll never have a problem with people linking you to your code.

Name: Anonymous 2014-11-21 23:49

>>313
I program in languages other than portable assembler. In these you create your own constructs and these become very unique to the programmer.

Name: Anonymous 2014-11-21 23:50

>>312
Yes. Any material about yourself online can be viewed and interpreted.
Just how odd is your programming style?

See the attempts on Satoshi (bitcoin)
Precisely, nobody's been able to do shit. If pretty much the entire world is busting their ass trying to uncover an anonymous programmer, why do you think a couple of faggots trying to uncover your identity through a browser will do any better?

Coding conventions like spacing and variable names are also an issue
Unless you do stupid shit like Nigger_Dicks-In_my_anus, there are at most four major variable naming schemes, namely FaggotCase, ngrvar, faggot_var or smug-var. Great, they just reduced the search space to 25% of all programmers, which is still a fuckton.

Name: Anonymous 2014-11-21 23:51

betterCase

Name: Anonymous 2014-11-21 23:58

>>315
Just how odd is your programming style?
Evolutionary.

Precisely, nobody's been able to do shit.
As far as we know.

If someone who is trying to deanonymize me is able to read my code, I've already made a mistake. I don't want it to come down to issues like coding style.

Maybe I'll write a compiler that compiles my code to standard C and never share the original source.

Name: >>315 2014-11-22 0:06

>>316
That's an element of the FaggotCase set.

>>317
I'll write a compiler that compiles my code to standard C
Please do.

Name: Anonymous 2014-11-22 0:07

>>317
Can you put that on github, please? I'd be interested in a code anonymizer.

Name: Anonymous 2014-11-22 0:11

>>319
I'll write one to suit my needs, which are likely different from yours. But if you promise not to post it elsewhere, I'll send it to you over a secure channel or something. I don't think I'll be able to bootstrap it and run the anonymizer on itself.
