
Have you read your PFDS today?

Name: Anonymous 2015-11-13 21:39

Purely Functional Data Structures
http://www.cs.cmu.edu/~rwh/theses/okasaki.pdf
When a C programmer needs an efficient data structure for a particular problem, he or she can often simply look one up in any of a number of good textbooks or handbooks. Unfortunately, programmers in functional languages such as Standard ML or Haskell do not have this luxury. Although some data structures designed for imperative languages such as C can be quite easily adapted to a functional setting, most cannot, usually because they depend in crucial ways on assignments, which are disallowed, or at least discouraged, in functional languages. To address this imbalance, we describe several techniques for designing functional data structures, and numerous original data structures based on these techniques, including multiple variations of lists, queues, double-ended queues, and heaps, many supporting more exotic features such as random access or efficient catenation.

In addition, we expose the fundamental role of lazy evaluation in amortized functional data structures. Traditional methods of amortization break down when old versions of a data structure, not just the most recent, are available for further processing. This property is known as persistence, and is taken for granted in functional languages. On the surface, persistence and amortization appear to be incompatible, but we show how lazy evaluation can be used to resolve this conflict, yielding amortized data structures that are efficient even when used persistently. Turning this relationship between lazy evaluation and amortization around, the notion of amortization also provides the first practical techniques for analyzing the time requirements of non-trivial lazy programs.
 
Finally, our data structures offer numerous hints to programming language designers, illustrating the utility of combining strict and lazy evaluation in a single language, and providing non-trivial examples using polymorphic recursion and higher-order, recursive modules.
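
For a concrete taste of the thesis, here is a minimal sketch (my own, not Okasaki's exact code) of the banker's queue in Haskell: a purely functional FIFO queue whose amortized O(1) bounds survive persistent use. Haskell's lists are lazy by default, so the suspended reversal comes for free; in Standard ML you would use explicit streams.

    -- A persistent queue: front list plus reversed rear list, with
    -- lengths tracked so we know when to rotate.
    data Queue a = Queue Int [a] Int [a]

    empty :: Queue a
    empty = Queue 0 [] 0 []

    -- Invariant: |front| >= |rear|. When it breaks, append the reversed
    -- rear to the front. Laziness delays the reverse until enough cheap
    -- operations have "paid" for it; that is the amortization argument.
    check :: Int -> [a] -> Int -> [a] -> Queue a
    check lf f lr r
      | lr <= lf  = Queue lf f lr r
      | otherwise = Queue (lf + lr) (f ++ reverse r) 0 []

    snoc :: Queue a -> a -> Queue a
    snoc (Queue lf f lr r) x = check lf f (lr + 1) (x : r)

    uncons :: Queue a -> Maybe (a, Queue a)
    uncons (Queue _  []      _  _) = Nothing
    uncons (Queue lf (x : f) lr r) = Just (x, check (lf - 1) f lr r)

Because snoc and uncons never mutate anything, old versions of the queue stay usable after an update; the laziness is what keeps the amortized bounds honest even then.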

Name: Anonymous 2015-12-05 8:34

>>73
Not many people are working on that stack, but that is what I'm getting at and what I'm looking for. No, >>71 is shit.

>>71
Just because there is noise at runtime doesn't mean you can't produce programs that will, on average, greatly outperform other programs. Whether GC is or is not slow depends greatly on how you are using it and what the alternatives are. But if you can avoid it altogether, that's one less thing your program needs to spend time doing; and if you take more control over how things are allocated in memory, you can arrange objects compactly to save memory and adjacently to improve cache performance.
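
(To make the layout point concrete: you can often choose a compact, adjacent representation even inside a GC'd language. A hypothetical Haskell illustration using the standard array package: a boxed list of Doubles is a chain of pointer-filled heap cells, while an unboxed array stores the raw values contiguously.)

    import Data.Array.Unboxed (UArray, listArray, (!))

    -- Pointer-chasing layout: every cons cell and every Double is a
    -- separate heap object that the GC must trace.
    xs :: [Double]
    xs = map sqrt [0 .. 999999]

    -- Flat, adjacent layout: a million raw 8-byte values in one buffer;
    -- less memory, and far friendlier to the cache.
    arr :: UArray Int Double
    arr = listArray (0, 999999) xs

    main :: IO ()
    main = print (arr ! 123456)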

Name: Anonymous 2015-12-05 8:35

Wait, who the fuck am I to know who is and isn't working on a lisp stack. Nevermind.

Name: Anonymous 2015-12-05 11:54

>>75
It's going to be either Lisp, Haskell or an ML in that realm. There's also C++ if you are lucky enough to be sponsored by Apple (LLVM).

Name: Anonymous 2015-12-05 12:03

>>74
>Whether GC is or is not slow depends greatly on how you are using it and what the alternatives are.

In other words, the statement "GC is slow" (or "makes it run slower than it needs to" and all its variations) is incorrect.

>But if you can avoid it altogether, that's one less thing your program needs to spend time doing

It's not something the programmer spends time doing, so it's moot. And because it can be as fast or faster at runtime than manual management, and is definitely faster for development time, you're a fucking pathetic idiot for discarding it.

>and if you take more control over how things are allocated in memory, you can arrange objects compactly to save memory and adjacently to improve cache performance.

You're wasting your goddamn time. Your life is useless, because you throw it at shit like this. People who are way smarter than you have already solved these problems, but you'll dick around with slower shit, spending months fiddling with bytes and address lines, instead of taking advantage of stuff that's better.

Again, you're a child playing with toys. Fuck off, because you have nothing mature to say.

Name: Anonymous 2015-12-05 12:30

This is why I hate the computing industry. Nothing has changed since the 1950s, when there was nothing but bits, and it was the human's job to manually route them around. People like >>74-chan are actually happy with that. That's either flagrant masochism, or ignorance about the potential of computing. Probably both.

There are the occasional people here & there who actually do perceive how much things suck with computing. However, they either don't know any better and can do nothing but just reinvent the wheel; or they get so lost up their own meta-rectum that they never actually get around to anything useful.

Computing remains at a stasis where humans serve the details of the machine, instead of the centuries-old vision of machines taking care of the details of humans.

Precisely and absolutely because of people like >>74 and Cudder.

Name: Anonymous 2015-12-05 12:47

The anti-GC bait is so strong... it's because there are actually a lot of stupid programmers who believe this.

Name: Anonymous 2015-12-05 13:48

>>25
>Do you live in a world where computer programs do not interact with humans in any way? Who gives a shit if the results come through 8µs faster
I live in a world where people increasingly (in fact, the majority now) use battery-powered devices - extra overhead or bad performance directly translates to less battery time.

Name: Anonymous 2015-12-05 14:07

>>79
This is not an imageboard, please take your imageboard catchwords somewhere else. If you don't know what I'm talking about, please reassess the size and quality of your vocabulary and try not to sound like a meme-spewing teenager next time.

Name: Anonymous 2015-12-05 14:18

>>81
what the fuck are you babbling about you insane retard

Name: Anonymous 2015-12-05 14:18

>>81
Fuck all the way off.

Name: Anonymous 2015-12-05 21:03

>>80
You live in a world where 99.99% of battery-powered devices are executing Objective-C, Java, or Javascript.
I am excluding handheld game consoles because you are clearly not developing for any.

Name: Anonymous 2015-12-05 21:28

>>84
>You live in a world where 99.99% of battery-powered devices are executing Objective-C, Java, or Javascript.
Which is obviously a bad thing.

Name: Anonymous 2015-12-05 21:41

>>84
You live in a world with niggers and sandniggers. Doesn't mean I'll hug niggers for a living.

Name: Anonymous 2015-12-05 22:21

>>82,83
LLLLLLLEEEEEEEEEEEELLLLLLLLLL E/G/IN /B/AIT MEME /G/RO, or at least that's what I think >>81-kun (aka lel-kunt) is trying to say.

Name: Anonymous 2015-12-05 22:35

>>85
Yes, but that also means the battery savings from such a minor optimisation are negligible compared to the rest of the shit being run.

Name: Anonymous 2015-12-05 23:38

I fucking means that those environments are already optimized for battery use and anything you fucklenuggets try to hack doesn't make a pissing shit of a difference.

The major power drains are the covered by the OS/driver/JIT.

Name: Anonymous 2015-12-05 23:40

(and I'm drunk, so spelling/grammar nazifags can fuck right the fuck off. all the way)

Name: Anonymous 2015-12-05 23:55

Don't justify your idiocy with that LOL I'M SO WASTEEED LMAOOO XD I'M SUCH A COOL BOY XDD act. These devices are severely constrained in what they are able to do and careless apping is why a 1GHz dual-core ARM CPU is barely able to put up with displaying pictures of your balding face without lagging like the piece of shit it is. It's the same reason your battery rarely lasts more than 30 hours.

Apping like a currynigger because of ``time constraints'' is just as bad as Cudder's bitfucking.

Name: Anonymous 2015-12-06 0:01

>>91
I'm just talking about spelling, not content, you piece of dick.

Sure, you can fuck your battery life. But any decisions about the algorithmic workload of your APP, like how many passes you run over your data or whatever, are best handled in a fucking high fucking level fucking language, not by dicking around with bytes and SEPPLES.

Name: Anonymous 2015-12-06 7:31

>>77

>In other words, the statement "GC is slow" (or "makes it run slower than it needs to" and all its variations) is incorrect.
Only when there are no faster alternatives, which is never the case unless you are unwilling to consider them.

>It's not something the programmer spends time doing, so it's moot.
It leads to a suboptimal program. It's crap.

>And because it can be as fast or faster at runtime than manual management,
There are more ways to manage memory than malloc and free. Like memory pools. With some additional assumptions, you can allocate and free memory in constant time, in a few assembly instructions (see the sketch at the end of this post).

>and is definitely faster for development time,
Not if you have meta-linguistic abstraction to do this work for you at compile time.

>you're a fucking pathetic idiot for discarding it.
Nope, a rational being.

>You're wasting your goddamn time.
No, I write programs to do this for me. The thing is, I try to get things right at compile time rather than use some resource-intensive overkill monstrosity that figures obvious shit out at runtime.

>Your life is useless, because you throw it at shit like this.
The value of my life is independent of these pursuits, and your judgement of it has no bearing on me.

>People who are way smarter than you have already solved these problems, but you'll dick around with slower shit, spending months fiddling with bytes and address lines, instead of taking advantage of stuff that's better.
Their solutions are utter shit. And trying to optimize their shit is too fucking hard. I'm starting with bit fiddling and finding the right set of abstractions that will trivially lead to optimal machine code.

>Again, you're a child playing with toys. Fuck off, because you have nothing mature to say.
No, the toys are python, ruby, javascript, java, and haskell. And I refuse to waste any more time playing with them.
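
(A hypothetical sketch of the memory-pool idea above, kept in Haskell like the other examples in this thread; a real C pool would store the free list intrusively, inside the free blocks themselves. The point is that both alloc and free are O(1): a pop or a push on the free list.)

    import Foreign.Marshal.Alloc (mallocBytes)
    import Foreign.Ptr (Ptr, plusPtr)
    import Data.IORef

    -- A pool of nBlocks fixed-size blocks carved from one allocation.
    data Pool = Pool { base :: Ptr (), freeBlocks :: IORef [Ptr ()] }

    newPool :: Int -> Int -> IO Pool
    newPool blockSize nBlocks = do
      buf <- mallocBytes (blockSize * nBlocks)
      fl  <- newIORef [buf `plusPtr` (i * blockSize) | i <- [0 .. nBlocks - 1]]
      return (Pool buf fl)

    -- O(1): pop a free block, or Nothing if the pool is exhausted.
    poolAlloc :: Pool -> IO (Maybe (Ptr ()))
    poolAlloc pool = do
      blocks <- readIORef (freeBlocks pool)
      case blocks of
        []       -> return Nothing
        (p : ps) -> do writeIORef (freeBlocks pool) ps
                       return (Just p)

    -- O(1): push the block back onto the free list.
    poolFree :: Pool -> Ptr () -> IO ()
    poolFree pool p = modifyIORef (freeBlocks pool) (p :)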

Name: Anonymous 2015-12-06 8:31

GC may be faster than manual management, but it never consumes less memory, and in fact needs 4-5 times as much memory just to achieve the same speed.

Name: Anonymous 2015-12-06 11:08

>>93 translated:

I am cudder

Lol, look at me move bits!

No, j00r progarm is slower becuz hooman do better than masheen!

I am cudder!

Name: Anonymous 2015-12-06 11:57

>>93
>I write programs to do this for me.
While the rest of us use the programs that already exist to do this for us, and get on with our lives.

Name: Cudder !cXCudderUE 2015-12-06 13:57

>>78
>or ignorance about the potential of computing
s/the potential of computing/the desire of HW manufacturers to sell product/
FTFY

This "GC isn't slow" bullshit really feels like a conspiracy. It's not slow to you because you don't know any better and are accustomed to programs taking more resources than they should. You just haven't seen the true small and fast.

>>94
That's just a roundabout way of saying (like most other pro-GC articles) that the less you have to GC, the faster it is. Manual management is fast if you never have to free() either. Duh.

Many years ago a friend went to one of the many Java advocacy events Sun (this was really long ago) ran. I remember this clearly because he told me one of the funniest things was seeing, in one of the presentations about how GC is "low overhead", the extreme lag of the presenter's laptop --- which was running their latest idea at the time, a full Java-based OS. No wonder that didn't catch on...

Name: Anonymous 2015-12-06 14:19

>>97
fuck off namefag

Name: Anonymous 2015-12-06 21:24

>>97
I'm >>78, and I've done everything from bit-banging, cycle-counting, hex-entered machine code 35 years ago to distributed inference compilers today.

I have seen true small and fast. I have written true small and fast.

The circumstances of small and fast do not hold the same on modern CPU architectures, with modern connectivity, with modern hardware compatibility to support, with the modern variety of file formats & encodings to support, and with modern flexible behavioral expectations.

>That's just a roundabout way of saying (like most other pro-GC articles) that the less you have to GC, the faster it is.
I think you need to read >>94 again. It has nothing to do with GC frequency.

>Many years ago, Java...

Yeah, it was pretty bad back then. However, "then" is also not "now," not by a long shot.

Compiler & runtime environment tech has come a long way in recent years, especially because regular machines now have the elbow room to analyze the code environment FAR better. The space and speed afforded to us by modern machines are well used in making the resulting code faster and smaller. That is appropriate use of resources.

Name: Anonymous 2015-12-06 22:23

>>97
This "GC isn't slow" bullshit really feels like a conspiracy.
It's not slow when all you use is turdware that spends more time swapping than running.
A bigger problem is programs using so much memory in the first place that GC vs manual or sometimes even compiled vs interpreted is dwarfed by swap disk speed.

Name: Anonymous 2015-12-06 22:28

>>98
Fuck off back to /g/ where ``namefags'' are an actual problem.

Name: Anonymous 2015-12-07 3:36

>>97
An optimal GC will be as fast as manual memory management. Also, free() is basically free; it's just dropping an entry in a couple of tables.

Name: Cudder !cXCudderUE 2015-12-07 6:47

>An optimal GC will be as fast as manual memory management.
The problem is that "optimal GC" doesn't exist, just like the optimal cache replacement algorithm, or the optimally optimising compiler, or world peace for that matter...

Look at it another way: if optimal GC did exist, it would have to know enough about the program and its inputs and outputs that it would probably have enough intelligence to question its existence and the meaning of life.

Name: Anonymous 2015-12-07 8:43

>>96
I'm actually trying to advance the human race by making something better here, as opposed to building on top of a foundation of crap.

>>99
>The circumstances of small and fast do not hold the same on modern CPU architectures, with modern connectivity, with modern hardware compatibility to support, with the modern variety of file formats & encodings to support, and with modern flexible behavioral expectations.
That was ENTERPRISE Quality!

>Compiler & runtime environment tech has come a long way in recent years, especially because regular machines now have the elbow room to analyze the code environment FAR better. The space and speed afforded to us by modern machines are well used in making the resulting code faster and smaller. That is appropriate use of resources.
Something is wrong if it takes a supercomputer to do code analysis. Design the language so that code analysis is easy. If certain methods of evaluation can be done using special-purpose instructions, let the programmer indicate this so the computer isn't responsible for coming up with it on its own.

Name: Anonymous 2015-12-07 8:45

>>103
Optimal compilers work best for high-level languages like Haskell.

Name: Anonymous 2015-12-07 13:55

>>104
So you don't want high-functioning software, and you don't want the computer to be smart enough to manage its own affairs.

Everything you say points back to the same thing: you want to be a slave to the machine, enjoying your waste of time in manually tracking bits and cycles.

You're nuts, and you're a blight on computing.

Name: Anonymous 2015-12-07 21:45

>>104
This motherfucker thinks the entirety of the past 50 years in computer science is ``a foundation of crap'' and that he can do better!!!

True FrozenVoid ``infinite compression'' levels of delusion.

Name: Anonymous 2015-12-07 23:21

>>106
>So you don't want high-functioning software, and you don't want the computer to be smart enough to manage its own affairs.
I want a better compiler, and that starts with a better design for a language: a design that can itself evolve to your needs. I want more control over the process because I know I can do better than some shitty Haskell implementation.

>Everything you say points back to the same thing: you want to be a slave to the machine, enjoying your waste of time in manually tracking bits and cycles.
You are enslaved to shit software. You have no ability to improve the software you use. Go write another raycaster in python. Change the world with your program that depends on an interpreter that looks like a sophomore's programming project from 1980.

>>107
It's sad but it's true. And it's because of people like you, worshipping a mountain of crap and getting stuck in the mud, or should I say crap, instead of forging a new trail.

Name: Anonymous 2015-12-07 23:45

>>107
XML, Java, x86, Windows, JavaScript, PHP, Ruby, Lua, MongoDB, HTTP, 4chan and reddit were created in the past 50 years. Just because something is popular and has lots of momentum doesn't mean it's good.

Name: Anonymous 2015-12-08 4:31

>>107
Why do you believe that languages are exclusively "compiled or interpreted"? That feature is not inherent to a language; it is a property of the implementation.

Name: Anonymous 2015-12-08 6:07

>>110
While that may be true, it just happens that all implementations of the python language are shit toy interpreters that could have been written in 1980, and shit C translators that generate source files so large they may as well be interpreters. There will never be a more advanced ``compiler'' for that language, because people are satisfied with every implementation being slow, overhead-inducing shit. While a better compiler is possible, the people talented enough to produce one know their time is better spent on other languages and other problems.

Name: Cudder !cXCudderUE 2015-12-08 10:27

>>106
>and you don't want the computer to be smart enough to manage its own affairs.
This "smart" shit has to stop. We'll just lose control of the things we build if we make them any more intelligent: "smart computers, stupid humans." It's already happening with these ultra-bloated, ultra-abstracted systems that are so complex no single person really understands what they do. And the corporates are leveraging this immense complexity to spy on and track us, while most people remain unaware because it's all been hidden under the guise of "smart".

>you want to be a slave to the machine
Ironic. If you make devices so "smart" they can manage themselves, they're going to start managing YOU. This is not a lame Soviet Russia joke. It's reality. How many people know about all the network traffic their smartphone/tablet/{insert your choice of locked-down user-hostile device} is making? Or what all the 100+ processes do? All they see is a shiny opaque box. They don't know what it's doing. Manufacturers don't want users to know. When it does something users don't want, they mostly shrug it off and keep going. That's the scary part. These (l)users are letting their devices control them, not the other way around.

How to fix this? Never let it do what you don't want. Make it do what you want. Don't passively let it do things that you should have control over. It's YOUR computer, YOUR choice, YOUR life.

Name: Anonymous 2015-12-08 12:38

>>112
Will to Power.

Name: Anonymous 2015-12-08 16:58

>>112
I hardly know what all the processes on a regular Linux box do. Everything is either very complex or insufficiently developed. I know that it can be a lot simpler. But we are living in a world where worse-is-better won. It's depressing.

Name: Anonymous 2015-12-08 20:00

>>112
What's worse is that these idiots are actually driving cars that are connected to the Internet and have been explicitly proved to be hackable. Deaths don't get much stupider than getting smashed into a tree because some 13-year-old Chinese kid got control of your car just to test his haxxor skills. Smart things indeed.

Name: Anonymous 2015-12-08 20:19

>>108
OK sure FV.

>>109
None of those are CS.

>>110
Whom are you quoting?
