Are they really better than handmade parsers? The code they produce is fucking ugly: it's cluttered with cryptic variable names, and you need to read entire books to understand how they really work. Yet everyone claims writing parsers by hand is stupid, because we've had parser generators for years and they will always produce much better results.
How true is this statement? Is it akin to the "you don't need to write ASM because compilers are much better than you" bullshit? I, for one, have never been in a situation where I needed to write my own assembly, but that doesn't make GCC any less of a piece of shit.
Parser generators: are they really better than handmade parsers?
I think you mean: "Generated parsers: are they really better than handmade parsers?"
Name: Anonymous 2014-09-19 17:53
Generated parser generators: are they really better than handmade parser generators?
Name: Anonymous 2014-09-20 10:28
Yes, because it abstracts the implementation from the specification. If you use a parser generator, another person can read your grammar and feed it to a different generator. It's far more time-consuming for them to read your handmade parser, work the grammar back out of it, and then write another handmade parser (or feed it to a generator).
Parser generators usually produce bottom-up parsers, while a typical handwritten parser is top-down. Bottom-up should be more efficient at deciding what it has got, since it can defer the decision until it has seen a whole production instead of predicting ahead, and it also has some potential for multi-threading.
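To make the "specification" point concrete, here's roughly what the grammar side looks like with yacc/Bison. This is a toy calculator grammar I made up for illustration (the classic calc example, more or less, with a throwaway lexer bolted on so it stands alone); it isn't from any real project, but any yacc-compatible generator should accept something like it:

/* Hypothetical toy grammar. The rules below ARE the specification;
   bison (or byacc, etc.) turns them into the implementation, so nobody
   has to reverse-engineer the grammar out of hand-written code. */
%{
#include <stdio.h>
#include <ctype.h>
int yylex(void);
void yyerror(const char *s) { fprintf(stderr, "error: %s\n", s); }
%}
%token NUMBER
%left '+' '-'
%left '*' '/'
%%
input : /* empty */
      | input expr '\n'   { printf("= %d\n", $2); }
      ;
expr  : expr '+' expr     { $$ = $1 + $3; }
      | expr '-' expr     { $$ = $1 - $3; }
      | expr '*' expr     { $$ = $1 * $3; }
      | expr '/' expr     { $$ = $1 / $3; }
      | '(' expr ')'      { $$ = $2; }
      | NUMBER
      ;
%%
/* Throwaway hand-written lexer so the example stands alone;
   a real project would pair the grammar with a flex spec. */
int yylex(void) {
    int c = getchar();
    while (c == ' ' || c == '\t') c = getchar();
    if (isdigit(c)) {
        yylval = c - '0';
        while (isdigit(c = getchar())) yylval = yylval * 10 + (c - '0');
        ungetc(c, stdin);
        return NUMBER;
    }
    return c == EOF ? 0 : c;
}
int main(void) { return yyparse(); }

Saved as calc.y, running bison calc.y gives you calc.tab.c, which cc compiles directly. Swap in byacc and you get a different implementation of the same spec, which is the whole point.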
Name: Anonymous 2014-09-25 9:06
For something as simple as writing a Scheme interpreter, would you rather use Flex/Bison or make your own parser?
Having written grammar specs in lex/yacc (and lots and lots of other parser generators, from antlr to attoparsec) and hand-rolled state machines, I can attest that you'd better be absolutely sure you actually need a hand-rolled machine before you write one.
I have stepped into this pile of shit more than once. Yes, handmade machines usually perform better and give you more control over error reporting and over passing grammar attributes around. BUT they are a device from hell when you suddenly need to change something in your grammar a month later. Then you are in a world of shit. With a parser generator, it's easy.
>>30 Two words: recursive descent. Simple and more than fast enough. There's a reason GCC and Clang both use RD (GCC used to use lex/yacc, and switching to the RD parser actually made it a little faster: http://gcc.gnu.org/wiki/New_C_Parser ). otcc/tcc use RD too.
Fun fact: I heard the bash bug wouldn't have happened if they'd used RD, since then they would've been far less inclined to run the whole bloody command parser/evaluator on a simple function definition instead of just the piece of the parser that handles funcdefs - you can't arbitrarily pull out a piece of a lex/yacc parser and reuse it.
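To put some code behind that (and the Scheme question above): a rough sketch of a hand-rolled recursive-descent S-expression reader, which is about all the parsing a toy Scheme needs. The names (Expr, read_expr, read_list, and so on) are made up for this post, not lifted from any real interpreter, and error handling is omitted. Every grammar rule is just an ordinary C function, so you can call exactly the piece you need on its own - the kind of reuse a monolithic yacc parser doesn't give you.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

/* grammar: expr := atom | '(' expr* ')' */
typedef struct Expr {
    enum { ATOM, LIST } kind;
    char *atom;            /* valid when kind == ATOM */
    struct Expr **items;   /* valid when kind == LIST */
    size_t len;
} Expr;

static const char *p;      /* cursor into the input string */

static Expr *read_expr(void);

static void skip_ws(void) { while (isspace((unsigned char)*p)) p++; }

static Expr *read_atom(void) {
    const char *start = p;
    while (*p && !isspace((unsigned char)*p) && *p != '(' && *p != ')') p++;
    Expr *e = malloc(sizeof *e);
    size_t n = (size_t)(p - start);
    e->kind = ATOM;
    e->atom = malloc(n + 1);
    memcpy(e->atom, start, n);
    e->atom[n] = '\0';
    return e;
}

static Expr *read_list(void) {
    p++;                   /* consume '(' */
    Expr *e = malloc(sizeof *e);
    e->kind = LIST;
    e->items = NULL;
    e->len = 0;
    skip_ws();
    while (*p && *p != ')') {
        e->items = realloc(e->items, (e->len + 1) * sizeof *e->items);
        e->items[e->len++] = read_expr();
        skip_ws();
    }
    if (*p == ')') p++;    /* consume ')'; a real reader would error on EOF here */
    return e;
}

static Expr *read_expr(void) {
    skip_ws();
    return *p == '(' ? read_list() : read_atom();
}

static void print_expr(const Expr *e) {
    if (e->kind == ATOM) { printf("%s", e->atom); return; }
    putchar('(');
    for (size_t i = 0; i < e->len; i++) {
        if (i) putchar(' ');
        print_expr(e->items[i]);
    }
    putchar(')');
}

int main(void) {
    p = "(define (square x) (* x x))";
    print_expr(read_expr());   /* echoes back the structure it parsed */
    putchar('\n');
    return 0;
}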
Name: Anonymous 2014-09-30 14:57
>>31 Gramps, it's 2014! Forget your "perl" already.
Name: Anonymous 2014-09-30 15:05
>>34 Hey, it'll be finished before the end of the century! (and hopefully before I die)
Name: Anonymous 2014-09-30 17:24
>>34 But >>31-san is right, Perl 6 grammars are good.
Fun fact: Rakudo bootstraps in essentially one pass. It updates the language core incrementally at run time, adding the features needed to finish compiling the compiler and the standard library, including the grammar support needed to parse much of the code that the initial grammar can't parse.
Name: Anonymous 2014-09-30 18:03
Too bad Perl is SLOW AS BALLS
Name: Anonymous 2014-09-30 19:08
>>37 It's not Ruby slow, it's not even Python slow.
Name: Anonymous 2014-09-30 20:01
Everything Higher-Order Perl wrote about can be done better in Lisp.
Name: Anonymous 2014-09-30 20:03
>>39 That's not the issue. When Lisp gets Perl 6 grammars we can talk.