
Python.

Name: Anonymous 2014-06-29 19:50

What's bad about Python?

Name: This thread. 2014-06-29 19:55

What's bad about This thread?

Name: Anonymous 2014-06-29 22:41

Everyone still uses 2.X.

Name: FIOC 2014-06-30 4:44

THE FORCED INDENTATION OF THE CODE!

Name: Anonymous 2014-06-30 9:54

significant indentation is bad. confusing variable assignment and variable declaration is also bad.

Name: TPOBCI 2014-06-30 12:43

The Pleasure Of Being Cummed Inside.

Name: Anonymous 2014-06-30 12:49

What's bad about The Pleasure Of Being Cummed Inside?

Name: Anonymous 2014-06-30 17:10

>>1
- Scope model is a giant hack/kludge, not really lexical, not really dynamic either
- Method lookup is defined to happen at runtime to support monkeypatching, and even basic things like math operators go through layers of method lookup indirection
- BDFL is as conservative as the Go developers in his understanding of how programmers are able to write and maintain code, so he rejects anything that does "invisible work" if he didn't already know about it when he was 25 years old
- He also has a "worse is better" approach (keep the implementation simple at all costs) which prevents improvements to the design of the standard interpreter or language features that would allow for better optimization techniques
- Even when a totally different team gets involved and makes something like PyPy, it's held back by the semantics of the C interface in the standard impl, so nobody can use it in any program that relies on the C FFI in any way

Python is not a bad language in theory, but it's a millstone for real work
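
A minimal sketch of the runtime method lookup described above (`Meters` is a hypothetical example class): even the `+` operator dispatches through `__add__` at runtime, and the method can be monkeypatched on the class afterwards.

```python
class Meters:
    """Toy wrapper showing that even + goes through method lookup."""

    def __init__(self, n):
        self.n = n

    def __add__(self, other):
        # the + operator is resolved at runtime via this method
        return Meters(self.n + other.n)

total = Meters(2) + Meters(3)
print(total.n)  # 5

# monkeypatching: rebinding the method on the class changes what + means
Meters.__add__ = lambda self, other: Meters(self.n * other.n)
product = Meters(2) + Meters(3)
print(product.n)  # 6
```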

Name: Anonymous 2014-06-30 17:18

>>4
That's actually the best part: it allows us to get rid of all the curly braces and semicolons.

Name: Anonymous 2014-06-30 17:27

>>9
But both are better

> and semi-colons
This is actually false

Name: Anonymous 2014-06-30 17:31

>>9
You can do that without FIOC. See Lisp, Scheme, OCaml, etc. These languages also feature multi-line lambdas.

Name: Anonymous 2014-06-30 17:38

N. 1 reason: BDFL is a kike

Name: Anonymous 2014-06-30 17:48

>>11
THE FORCED PARENTHESATION OF THE CODE!

Name: Anonymous 2014-06-30 17:49

>>11
lisp and scheme have ()

Name: Anonymous 2014-06-30 17:49

Python is not as good as it is made out to be; in other words, it suffers from a degree of hype. I'll try to argue this point. Potential detractors of the language usually lack the experience to criticize it authoritatively. This is especially true for Python, as it is not (yet) common for people to be coerced (by work or school) into learning and working with the language. So the detractors are few, and they are drowned out by the vocal supporters.

The proponents of Python cite 'indentation' as the worst problem, which is a strawman: 'this is the worst problem, and it's not really a problem.' This argument has been voted up, presumably by people who like the language, because it is certainly not a good reason not to use Python. I am far from an expert at Python, but I have done a couple of semi-serious projects in the language and will try to recall specifically what I didn't like.
- Broken scoping: Python, like many other scripting languages, does not require variables to be declared, as with (let ((x 123)) ...) in Lisp or int x = 123 in C/C++. This means that Python can't even detect a trivial typo: it will produce a program which runs for hours until it reaches the typo, THEN goes boom, and you lose all unsaved data. Local and global scopes are unintuitive. Having variables leak after a for-loop is confusing. Worse, binding of loop indices can be very confusing; e.g. "for a in list: result.append(lambda: fcn(a))" probably won't do what you think it would. Why the nonlocal/global/auto-local scope nonsense? Such shortsighted scoping design leads to Python's faulty package system, which exposes everything, so that, for example, typing time.sleep=4 instead of time.sleep(4) silently replaces the process-wide sleep function; accidentally assigning to time.sleep won't even give a runtime error, just a hard-to-trace bug. Moreover, dynamic scoping impedes encapsulation and compilation, so everything you write will be open source: no FASLs, DLLs or EXEs, while a developer may want control over the level of access to prevent exposure of internal implementation, either because it contains proprietary code or because a strict interface/implementation decomposition is required.
- Inconvenient syntax: FIOC, or Forced Indentation of Code (aka the "off-side" rule), impedes using the CLI, generating Python code automatically, and moving around large code blocks. Editing Python code requires special editors (forget about Word/Notepad) that expand tabs into spaces, while transmitting Python code through a web post or email breaks indentation. The absence of a block terminator is so utterly confusing that you'll find yourself ending blocks with #endif anyway. It's painful to deal with other things that need indenting, such as large SQL queries, or HTML when you're using things like mod_python. And why is there ":" when code almost always has a newline after it? Python's whitespace indentation indulges messy horizontal code (> 80 chars per line), where in Lisp one would use "let" to break a computation into manageable pieces: get used to stuff like self.convertId([(name, uidutil.getId(obj)) for name, obj in container.items() if IContainer.isInstance(obj)]), with logical connectives lost in a pile of other symbols, like the "and" in "if y > 0 or new_width > width and new_height > height or x < 0". Instead of the usual "fold" and "map" functions, Python uses list-comprehension syntax, which has a large collection of underlying linguistic and notational conventions, each with its own variable-binding semantics: good luck discerning [f(z) for y in x for z in gen(y) if pred(z)] from [f(z) if pred(z) for z in gen(y) for y in x]. In addition, you will enjoy cryptic expressions like z(*z(*m.i())[::-1]).
- Crippled support for functional programming: Python's lambda is limited to a single expression and doesn't allow statements. Python makes a distinction between expressions and statements, and does not automatically return the last expression, crippling lambdas even more. Assignments are not expressions. Useful higher-order functions were demoted in Python 3.0: reduce has to be imported from functools. Creating an object just to call a function, as in ''.join(map(str, r)), is an annoying sign of bad language design. No continuations, or even tail-call optimization: "I don't like reading code that was written by someone trying to use tail recursion." --Guido
- Inconsistent type system: no Numerical Tower, meaning that (in Python 2) 1/2 produces 0 instead of 0.5, leading to subtle and dangerous errors. Arithmetic on strings is surprising at best: "2" * 3 is "222" and "2"+"3" is "23", while sum(["2", "3"]), "2" * "3" and "2"+3 are type errors. Multiple false values: "", [], (), False, 0, 0.0 all auto-coerce to false, but 0==False and 1==True, while ""!=False and ()!=False; worse, True/False auto-coerce to integers, so False/True==0 and ['no', 'yes'][False] won't give an error, although ['no', 'yes'][0.0] does. Why have both dictionaries and objects? Why have both types and duck typing? The Python language reference devotes whole sub-chapters to "Emulating container types", "Emulating callable objects", "Emulating numeric types", "Emulating sequences", etc., only because arrays, sequences, etc. are "special" in Python. Subtle data types (list and tuple, bytes and bytearray) will make you wonder "Do I need the mutable type here?", while Clojure and Haskell manage with only immutable data.
- Python has too many confusing, non-orthogonal features: mutable objects (e.g. lists) can't be used as hash keys; expressions in default arguments are evaluated when the function is defined, not when it's called. Patterns and anti-patterns are signs of deficiencies inherent in the language: in Python, concatenating strings in a loop is considered an anti-pattern merely because the popular implementation is incapable of producing good code in such a case, and the intractability or impossibility of static analysis in Python makes such optimizations difficult or impossible. Quirky triple-quoted strings seem like a syntax decision from a David Lynch movie, and double underscores, like __init__, seem appropriate in C but not in a language that provides list comprehensions; there are better ways to mark features as internal or special than calling them __feature__. self everywhere shows that OOP was bolted on. Unicode support is poor, and unicode string handling is somewhat awkward.
- Poor performance: Python's GC uses naive reference counting, which is slow and doesn't handle circular references on its own, meaning you have to expect subtle memory leaks and can't easily use arbitrary graphs as your data. In effect, Python complicates even simple tasks, like keeping a directory tree with symlinks. The Global Interpreter Lock (GIL) is a significant barrier to concurrency; due to signaling with a CPU-bound thread, it can cause a slowdown even on a single processor. The reason for employing the GIL is to ease the integration of C/C++ libraries. Additionally, the CPython interpreter code itself is not thread-safe, so the only way other threads can do useful work is if they are inside some C/C++ routine, which must be thread-safe.
- Python has no well-defined, stable API, making life easy for Guido and tough on everybody else; that's the real cause of Python's "version hell". Python has inherited the installation mentality, the idea that libraries should be installed, so it is in fact designed to work inside unix package management, which carries a fair amount of baggage (library version issues) and reduced portability. Of course it is possible to package libraries with your application, but it's not conventional, and deploying as a desktop app can be hard due to cross-platform issues, language version, etc. Open-source projects generally don't care about Windows; most open-source developers use Linux because "Windows sucks". Python third-party library licensing is overly complex: licenses like MIT allow you to create derived works as long as you maintain attribution, while the GNU GPL and other 'viral' licenses don't allow derived works without inheriting the same license. To inherit the benefits of an open-source culture, you also inherit the complexities of licensing hell.
- No outstanding feature that makes the language, like the brevity of APL or the macros of Lisp. Python doesn't really give us anything that wasn't there long ago in Lisp and Smalltalk. "This is a hobby project and I needed my time for other things…" --Guido van Rossum (http://1997.webhistory.org/www.lists/www-talk.1993q1/0060.html)
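
The loop-binding pitfall from the 'Broken scoping' bullet, sketched with the post's fcn/a names (fcn here is a hypothetical stand-in):

```python
def fcn(a):
    return a * 10

result = []
for a in [1, 2, 3]:
    result.append(lambda: fcn(a))  # the closure captures the variable, not its value

# all three closures see the final value of a (late binding)
print([f() for f in result])  # [30, 30, 30]

# common workaround: freeze the current value with a default argument
result2 = []
for a in [1, 2, 3]:
    result2.append(lambda a=a: fcn(a))
print([f() for f in result2])  # [10, 20, 30]
```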
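
For the record, the comprehension in the 'Inconvenient syntax' bullet does have a defined reading: clauses nest left to right, exactly like the equivalent for/if statements (f, gen, pred and x are hypothetical fillers):

```python
def gen(y):
    # hypothetical filler: a value and its successor
    return [y, y + 1]

def pred(z):
    return z % 2 == 0  # keep even values

def f(z):
    return z * z

x = [1, 3]

# comprehension clauses nest left to right...
via_comp = [f(z) for y in x for z in gen(y) if pred(z)]

# ...exactly like the equivalent statement form:
via_loops = []
for y in x:
    for z in gen(y):
        if pred(z):
            via_loops.append(f(z))

print(via_comp)               # [4, 16]
print(via_comp == via_loops)  # True
```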
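
A quick sketch of the 'Crippled support for functional programming' bullet: a lambda body is a single expression (conditionals only via the ternary form), and reduce must be imported from functools in Python 3:

```python
from functools import reduce  # no longer a builtin in Python 3

# a lambda body is one expression; no statements, no multi-line blocks
sign = lambda n: -1 if n < 0 else 1
print(sign(-5), sign(7))  # -1 1

total = reduce(lambda acc, n: acc + n, [1, 2, 3, 4], 0)
print(total)  # 10

# the ''.join(map(...)) idiom the post complains about
print(''.join(map(str, [1, 2, 3])))  # 123
```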
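
The bool/int quirks from the 'Inconsistent type system' bullet can be checked directly; note they still hold in Python 3, which did, however, change / to true division:

```python
# bool is a subclass of int: True and False behave as 1 and 0
print(0 == False, 1 == True)     # True True
print(['no', 'yes'][False])      # no
print(False + True + True)       # 2

# but other falsy values do not compare equal to False
print('' == False, () == False)  # False False

# string "arithmetic" is repetition and concatenation, not math
print('2' * 3)    # 222
print('2' + '3')  # 23
# '2' + 3 and '2' * '3' both raise TypeError

# Python 3 changed / to true division; floor division is spelled //
print(1 / 2, 1 // 2)  # 0.5 0
```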
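
The default-argument gotcha from the 'non-orthogonal features' bullet, with a hypothetical append_item: defaults are evaluated once, when the function is defined, not per call:

```python
def append_item(item, acc=[]):
    # the default list is created once, at def time, and shared across calls
    acc.append(item)
    return acc

print(append_item(1))  # [1]
print(append_item(2))  # [1, 2]  (the default remembers the previous call)

def append_item_fixed(item, acc=None):
    if acc is None:
        acc = []  # a fresh list on every call
    acc.append(item)
    return acc

print(append_item_fixed(1))  # [1]
print(append_item_fixed(2))  # [2]
```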
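
On the 'Poor performance' bullet: CPython does use reference counting, though, contrary to the bullet, a supplementary collector in the gc module has reclaimed cycles since version 2.0; a quick check:

```python
import gc
import sys

x = []
# refcount of a fresh object: the x binding plus getrefcount's temporary argument
print(sys.getrefcount(x))  # typically 2

# a reference cycle that pure reference counting alone could never reclaim
a = {}
b = {'peer': a}
a['peer'] = b
del a, b

# CPython's supplementary cycle collector (the gc module) picks it up
collected = gc.collect()
print(collected >= 2)  # True: at least the two cycle members were found
```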

Name: Anonymous 2014-06-30 18:32

Too much syntax and a badly organized standard library.

Name: Anonymous 2014-06-30 18:51

>>13,14
Python has parens too, and you can't do without them. I think parens are the most common syntax element. Adjusting to reading sexprs doesn't take very long. You can do it in an afternoon by writing code for a few hours.

I've gotten annoyed with the C-derived syntaxes because they look butt-ugly when you format them like sexprs, even though they have a tree structure. Seeing some kind of }])]}}) at the end of a line looks truly awful. It's really common to stack these on the same line in Javascript, or really any C-syntax with lambdas/closures. Gross.

>>17
Nobody is going to read all that.

Name: Anonymous 2014-06-30 18:51

>>17
I did

Name: Anonymous 2014-06-30 19:00

>>17
Nobody has ever read all that in all the years it has been repeatedly posted.

Name: >>17 2014-06-30 19:38

Quote failure, I meant >>15 not >>17. You guys seem to have figured it out.

>>18
Why? It's full of BS as far as I can tell. I'm no fan of Python but I've skimmed it and it seems to be flamebait.

Name: Anonymous 2014-06-30 19:46

>>20

> It's full of BS as far as I can tell. I'm no fan of Python but I've skimmed it and it seems to be flamebait.
can you elaborate on that?

Name: Anonymous 2014-06-30 22:39

>>20
There is some lame criticism and some personal-preference issues, but no factual errors that I can find.

Name: Anonymous 2014-06-30 22:51

>>22

can you elaborate on that?

Name: Anonymous 2014-06-30 23:37

>>20
I mean I read >>17
I remember reading >>15 in the past; as far as I recall, about a third of it is wrong and the rest is right, but maybe I'm misremembering.

Name: Anonymous 2014-07-01 6:18

If not Python, then what?

Name: Anonymous 2014-07-01 9:13

>>25
Go, usually. Lua for things where you actually have some reason for needing an interpreter at runtime. Real languages for everything else.

Name: Anonymous 2014-07-01 9:50

>>26
> Go, usually
U MENA C

Name: Anonymous 2014-07-01 10:02

>>26
Lua is fucking shit. Not even FIOC is as bad as that Javashit clone made by a brazilian kike.

Name: Anonymous 2014-07-01 11:09

>>28

At least Lua doesn't use the off-side rule, and its design is simple. Otherwise Lua is pretty similar to Python.

Name: Anonymous 2014-07-01 11:18

>>29
Forced flushing of cache, indices start at 1, everything is a shitty table. Looks pretty shitty to me.

Name: Anonymous 2014-07-01 11:20

>>30

As I said "Lua is pretty similar to Python."

Name: Anonymous 2014-07-01 14:01

>>27,28,30
Okay, lemme elaborate.

If you're considering Python, it's safe to assume it's not totally unsuited for the project you're working on, meaning (among other things) you're not doing anything where GC is a bad idea. You're probably choosing it for its community, documentation, readability, and/or rich library ecosystem. You'll likely do some exploratory programming to iterate on your design, and use a testing regimen to keep your program on spec and hunt down errors. And if the situation calls for it, you're prepared to rewrite parts of your code as native extensions once you've identified performance bottlenecks or memory consumption issues.

(I hope it's obvious that much of this applies to Lisp, particularly Clojure. However, if you're considering Python and you don't know what else you should be using, I doubt you'd take that as a serious suggestion.)

Anyway, for all its good points, Python is fatally flawed (for the reasons already discussed) and I don't see much of a future for it outside of education and system administration.

Now, I mentioned Google Go as the main alternative. Something like 70% of the time, Go is workable as a drop-in replacement for an entire Python+C codebase (modulo the obvious cost of rewriting all your shit). It's slightly more verbose, and it still has all the pain points of old world control flow (especially the error handling idioms) but there are no particular speedbumps to speak of in the transition from Python. It has fast compile times, good tooling, a rich standard library, and built-in data structures sufficient for writing a million pounds of glue. Basically, if you know Python is fucked up but you don't know how to choose between the alternatives, just pick up Go and everything will more or less sort itself out. Trust me.

There's a smallish class of uses for which Python is still better suited than Go, but not by design, and this is where Lua shines because it is designed for that problem space. If you need an extension language for your C application, or you need to be able to patch and reload your code without a compilation step, or you just don't know what you're actually doing with your libraries, then you will want a so-called dynamic language, but you won't want a slow, heavyweight interpreter that's hard to integrate into your existing C code. That's Lua. The warts, like indexing from 1, are pretty superficial, and you'll get used to them. In my opinion the best part is that you can replace the standard implementation with LuaJIT for speeds that Python can only dream of, and the C interop doesn't turn into a federal fucking issue in the process.

The caveat to using Lua is that you have to do some lifting at the beginning to get your C APIs exposed to its interpreter. If you really need to hit the ground running and start coding right away, obviously there's another popular alternative called Node.js. I don't think I need to explain to /prog/ why this is terrible, and I mention it only with the reluctance of an air traffic controller walking a couple of teenagers through landing a 747 after everyone else on board has vanished in the Rapture. Like, if that's where you end up, you've already got lots of other problems I can never help you with, and you just need to bite the bullet and fucking deal with them in whatever way you can.

As for writing everything in C, that qualifies as a "real language" insofar as there's not yet a viable alternative for writing that kind of low-level code. In some cases you can generate fast C from a Haskell program using certain DSLs, like Atom (https://hackage.haskell.org/package/atom), and this is way saner than writing C by hand and introducing bugs that destroy your hardware or whatever. In other cases, like if you're writing an AAA game, you'll be up to your eyeballs in so many weird performance hacks that whatever you use ends up as a glorified assembly language macro package and you're gonna die no matter what. There was also a long period of time when C was the only usable applications language for every major desktop OS, but all the vendors are moving away from that if they haven't already, so there's C# and Vala right now and Swift is coming soon. C++ will live alongside them for quite some time, perhaps longer than Python sticks around, thanks to Qt and LLVM and some other stuff.

If your problem domain involves a lot of non-interactive data crunching and you don't have your hands and feet nailed to the JVM like some tragic chump, then you're in the nirvana-like realm where you get to use Any Language, So Long As It's Fast Enough™. This is where D actually gets used, and where most of the Haskell action is going on, and where you can get away with doing your whole thing in Common Lisp because a single programmer can write the entire system and nobody will need it in six more months. This is also where people still use Perl, and are starting to play with Julia, and even do dumb stuff like put everything in a Foxpro database and use AutoHotkey macros to communicate between that and Excel. (I have seen this happen.)

And this kind of turned into a much longer rant than I intended so I'll just stop talking.

Name: Anonymous 2014-07-01 14:35

>>32

> You're probably choosing it for its community, documentation, readability, and/or rich library ecosystem.
These are subjective categories, irrelevant to good language design. But yes, Python's community is good, if you consider "good" to mean being a drooling linux-tard or a google shill.

> not doing anything where GC is a bad idea.
For most interactive applications GC is a bad thing, because there is no guarantee that your app won't stop for the next few minutes to collect the shit. Haters gonna hate, of course, but this isn't hate, just distaste for a tool that hits my finger and makes me lose my job.

Name: Anonymous 2014-07-01 14:38

Name: Anonymous 2014-07-01 14:46

>>33
> your app wont stop for the next few minutes to collect the shit
GC never takes minutes in interactive applications, GC-hater. GC pauses matter ONLY on mission-critical embedded platforms, like a microwave.

Name: Anonymous 2014-07-01 15:00

>>35
> GC never takes minutes in interactive applications
How do you calculate the worst-case bound?

Name: Anonymous 2014-07-01 15:34

>>36
Empirically!

Name: Anonymous 2014-07-01 18:34

There's still no better language than common lisp. Use scheme for scripting.

Name: Anonymous 2014-07-01 18:35

>>37
Empiricism is popular among simple minded labrats.

Name: Anonymous 2014-07-01 18:37

>>39
And calculationism is popular among freshmen who don't know about the halting problem.
