[flagged] Why Python Is Terrible (josvisser.substack.com)
58 points by thijser 9 months ago | 121 comments



I wish people could be honest and say they don't care for some language or framework or OS for personal or aesthetic reasons, rather than having to round it up to being objectively bad, but then I suppose nobody probably would click on "I don't like Python and have got some nits to pick".

Oh, and he just says what is supposed to be the quiet part at the end:

>And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living. I know it sounds harsh and I apologize for that, but writing software is a profession and hence you should have the knowledge, skill, and experience to use professional tools.

Hear that all data scientists, flask devs, systems engineers, and ML folks? Python is bad so you should quit. ;)


I see this sort of person as an extremely rigid, unbending, dime a dozen type person even if they’re very intelligent. I’m in this business to succeed and build and create things, there is very much an “energetic” aspect and his energy is dead as fuck. It’s a very simple fact that some see it, and some don’t. He’s one who doesn’t, and all those that see it can see it so clearly. He’s most definitely not someone I would want on my team.

The amount of idiotic … implications of his statement is so excruciating it’s physically painful to me. But I encounter this unadaptive unawareness all the time when working with programmers from other teams, etc., so I’m used to it.


He published an article in June titled "It is not cool to be unkind!".

Interesting contradiction.

I know someone like that, very energetic, full of ideas, but so stubborn I'm sure it holds him back.


Lol bragr did not offer any ideas, only cranky complaints. Who cares?


Aha, you pointed out the disease.

This is the biggest problem in software and it's kind of intractable.

The ideal world has tools that empower everyone to do what they need to do, which to some extent must include an activity like programming.

But, and this may be unconscious, "people who program for a living" have a strong incentive to gatekeep.


What's amazing is people who program for a living, who people would normally think of as being "experts", have so little knowledge of all the different types of programming that is done by, well, people who program for a living, and the tools they use to do that programming.

So not only is it gatekeeping, it's also a sign of an inexperienced programmer.

I'll give one thing to Python programmers, they tend to work directly with their end users and build solutions that matter to them. Anyone that's done that kind of work knows it's difficult, regardless of programming language being used.


And arguably, it is the more "important" work, or at least the stuff requiring the most attention?

I know that's a tough thing to say -- but yeah, the irony is that a really really good "low-level programmer" is destined for obscurity, because things just work and once they're working people forget about it. Shouts to Linus Torvalds.


You're saying that specialization is bad and that specialists can't be experts.


No. I'm saying people that make these ignorant statements aren't experts.


I see, thanks.


Clearly a Go enthusiast who never mastered Python. And, really, Go? If you were pushing Rust mayyyybe I could give you the benefit of the doubt.


> I wish people could be honest and say they don’t care for some language or framework or OS for personal or aesthetic reasons, rather than having to round it up to being objectively bad, but then I suppose nobody probably would click on “I don’t like Python and have got some nits to pick”.

Yeah, this kind of hyperbolic headline article repeating mostly dead-horse arguments is just low-effort click farming, not a serious contribution to the discussion of anything. All it adds to what is a well-trodden debate is…factual errors, like the claim that Python prominently features “lazy evaluation” (while, as in any language, you can arrange laziness in Python, it is very much eager normally.)


I really think that Python is not a good language for ML, it just got "there" first.

The ecosystem is the real plus, of course. But the language is a headache for this. I agree with the "false economy" angle. I would happily trade the "agility" of dynamic "gluing" for some kind of real type safety, human-readable error messages and performance[0].

[0] - hiding C in Python's clothes doesn't count :)


Python is the de facto glue language with one of the biggest ecosystems out there, which makes it possible to use any kind of over-the-top library that does 1093 things after a single `import antigravity`. Also, Python absolutely makes sense for ML, it’s not like most PLs have actual support for video cards — ML is very specifically about manipulating data (a fundamentally dynamic task) and calling out to specific libraries for training, a very glue-like task. Give me any language better than Python for that.

Quoting Brooks (butchered): “the only significant productivity improvement comes from relying on code that is already written”. Your fancy “better” language has not even 1/10th of what python has, it won’t replace it.


> Python is the de facto glue language with one of the biggest ecosystems out there

I never contested that.

> ML is very specifically about manipulating data (a fundamentally dynamic task)

I disagree strongly with using dynamic languages for data. Data has dimensions, units, types. You need to know that you're not adding coats to horses or USD to EUR. You need to know that you didn't silently slice by the wrong axis. You may want formal verifications. You may want to transform data without worrying about silent errors.

All the "metadata" and wrapper classes in Python ML are just trying to give you what the language can't.

> a very glue-task. Give me any language better than python for that

That's my point exactly. ML has evolved beyond glueing a few C libraries. It needs complex, big programs, which is an area where Python is terrible. Also the different nature of the "glued" components (each with its own data formats, protocols and calling conventions) makes the glue a mish-mash of untyped mixed magic idioms.


Training and using ML are different things. It has been relatively common to bundle the trained weights with a system written in a different programming language - but training can be a more exploratory phase, so Python is not a bad fit for that.


> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living. I know it sounds harsh and I apologize for that, but writing software is a profession and hence you should have the knowledge, skill, and experience to use professional tools.

Part of the profession of software engineering is maintaining software that's already written. Should the people who maintain Python code not be paid for their work?

Another part is choosing the right tool for the job. Python has its flaws, but it is better than Go in some ways. For example, it has a richer ecosystem of libraries.


Let me ask the obvious question.

Why hasn’t the Go community of professional software engineers built an even richer ecosystem of libraries?

Is it ennui, incompetence, or attitude?

As Go came from Google, is it that the attitude was, “I am a professional, I’ll just write my own code to solve X,” rather than considering building a library that others can use?

Are libraries harder to build in Go? Is adoption of libraries by the Go community different?

Is it a mindset, that libraries are uninteresting?

Or is it something else entirely?


Compare the number of programmer-hours on Python to Go. Having a ~15 year head start helps a lot.

Plus, I've been programming in Go professionally for a while and it's been a while since I reached for a critical library and it was missing.

Go ends up needing fewer libraries anyhow; in Python you have the pure-Python version, no, wait, the Twisted version (which may not be current but are still all there, increasing the library count), no, wait, the async version, no, wait, the Python 2 version, no, wait, the one that binds to a C library.... and actually this isn't specific to Go, it's really more specific to Python.

Library count gets bloated up over the decades by the fact that when, for example, Python went to async, all non-async libraries in which async was relevant suddenly needed a clone. Go has had effectively no language changes which create such parallel libraries (note changes, not additions; Go has had additions which create opportunities for more libraries but don't create lots of parallel libraries).

I don't know that Python is uniquely changing, "best practices C++" is arguably up there, Javascript has had a lot of churn, but it's in the higher tier of such things. Languages that don't churn like that should be able to cover the same amount of "need" for libraries in fewer libraries by the numbers.


Yes. Go has a culture that avoids using libraries (the axiom is "a little copying is better than a little dependency"; see, e.g., https://www.efekarakus.com/2021/09/23/a-little-copying-is-be...).


Having used go professionally, I would comment it's not just a little copying, but can be rather substantial.


It hasn't been around as long as Python.


That is true, but where are the go libraries for machine learning, tensors, LLMs, equivalents to PyTorch that I see bandied about?

Others have stated that they don’t need such libraries in Go, or that Python just has a glut of libraries that are a sort of detritus: obsolete, or specific to a language version.

Why isn’t Go, then, THE language of choice for machine learning, analytics, fast prototyping, data conversion, ETL, etc.?

What I’ve heard of Rust in comparison is that the syntax of the language & the learning curve is counterintuitive for engagement by non-computer scientists or systems programmers.

It seems expected, then, that non-comp-sci folks will choose a language that is more accessible, with an ecosystem that lets them get things done in a short period of time.

I wonder whether the experts and heavy users of Go, or the language architects moving Go forward, see any need for a parallel course of engagement to bring Pythonistas into the fold, or to let Go become the language people choose over Python or TypeScript, et al., by providing a best-in-class approach to quickly prototyping and then extending into well-architected systems as an underlying foundation.


Why do people use Microsoft Windows on x86?

The computer industry has a bizarre reputation for moving fast and breaking things. In fact the industry is shockingly conservative. You will encounter many, many programmers who flat-out refuse to learn new things.


Most companies & universities aren’t exposed to cutting edge or even modern technology. They stay in their bubble.

Leaving Silicon Valley, I’ve found the engineering management culture can be averse to training and allowing active skill building via development projects.

Just code that shit in Java or C++, doesn’t matter if you haven’t been trained in SQL properly or are even aware of best practices.

That culture leads to some gruesome product implementations & upgrade scenarios.

Hell, our IT department blocks this very website from being accessed.

I access it from my personal devices.


What's the Go community? Google employees? There's no Go community, it's a corporate language.

Why hasn't the Visual Basic for Office community produced more libraries?


Not sure if that was only a rhetorical question, but Go has the expressivity of C. Let’s not compare it to Python, which often reads like textbook pseudocode.


If the “expressivity of C” was such a wonderful thing, why was Go even needed? Why were Java & J2EE promoted for enterprise development?

Might it be that coding in C was and is hard, and further that memory management, pointer issues, and more led to unstable code?


I wrote my comment as a negative — C is terribly inexpressive, just as Go.


Terrible is using a dynamic programming language and expecting static features from it. Besides, linters and type hinting have come a long way.


>Besides, linters and type hinting have come a long way.

You acknowledge that the kinds of static analysis that are feasible in Python are valuable, but it's "terrible" to want the kinds of static analysis that are infeasible. How interesting that the two boundaries line up exactly.


If you want static typing, typically you want to avoid python.

If you want the freedom to easily mangle json or other dynamic types, you'd enjoy python.


They have, but I find they still lag behind the state of the art. Python's insistence on optional, afterthought support for type hints is frustrating.

Linting has saved my bacon more than once, granted.
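
For what it's worth, a minimal sketch (with hypothetical names) of the kind of bug the optional checkers do catch when you opt in, assuming a checker like mypy:

    from typing import Optional

    def find_user(name: str) -> Optional[str]:
        # Hypothetical lookup: returns None when the name is empty.
        return None if name == "" else name.upper()

    # A checker such as mypy flags the next line (Optional[str] passed where
    # str is required); at runtime it only blows up when name == "".
    print(len(find_user("")))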


Still incredibly horrible for many cases. And very slow compared to things that have been around for decades. Do some OCaml and you'll see how incredibly bad everything 'linted' is. You can compile millions of lines in seconds in OCaml, or Delphi on a 90s computer, or Jai (I don't find Blow sympathetic, but he does point out how lame everything is, and that's good), etc., but when my 50k-line Python or TypeScript project starts linting during a yarn build, I have time for a good workout.

Ocaml is an interesting example imho, as the type inference engine is so good you hardly have to specify types. When you read it and squint, it looks dynamic. It's not.


There's an evaporative cooling effect. Ten years ago it was obvious that Python was going to have a very hard time in the multicore world. People who needed that performance and knew they needed that performance left somewhere in the intervening years. It has been obvious for a while that Python was not going to be capable of a general solution to that problem no matter what it did. Now those people are no longer in the Python community.

What's left are people who don't need that performance, which is sometimes me and is when I still am happy to use Python for something, and people who do need that performance, but don't know it. Those are the ones who get into trouble.

I do wish that the Python developer community would be more open about the fact that Python performance is really quite bad in many cases and that while there are some tools to peck around the edges, it is still fundamentally a low performance language, things like NumPy notwithstanding (it is, ultimately, still a "peck around the edges" tool, even if it handles the particular edge it pecks at extremely well, but that only makes the performance delta worse when you fall out of its accelerated code path). I feel like maybe back in the 200Xs the community as a whole was more open to that. Now "Python is slow" is perceived by the community as an attack. Maybe because the ones who did understand what that issue was are mostly gone.

But in the 2020s, yes, Python ought to be straight-up eliminated from many tasks people want to do today on straight-up performance grounds. Overcoming a ~40x disadvantage in the single core era was sometimes tough, but often doable. But nowadays basically that gets multiplied by each core on the system, and overcoming a multi-hundred-factor disadvantage is just not worth your time IF that is a level of performance you need. Python is nice, but not nice enough to pay for multiple-hundred-factor performance penalties. Static languages have come a long way since 1995.


If the only problem with Python was its slowness, I could live with it! Its other problems are worse (like tooling, bad type hinting, etc).


Agreed, but I think slowness is the one that is impossible to fix without making Python into something other than Python. Since the day it was born we were promised that Sufficiently Smart Compilers would come along and make it as performant as C. We know what that looks like in the limit now, which is basically PyPy; it doesn't get to C except in small cases and eats a ton more memory in the process, and there's no reason to believe it will ever happen.

The semantics of Python are fundamentally slow. To fix that requires changing semantics. Such a change would dwarf 2 -> 3 in size and be effectively a new language, like "Perl 6" was.


"there are only two kinds of languages: the ones people complain about and the ones nobody uses".


Every single "This language is good" or "This language is bad" take really needs to always come with a "for what, exactly."

"This wrench is really bad for hammering nails!"


Agreed. But I also think it's fair to criticize general purpose languages on general grounds :)

It's just that this article isn't very good at it.


It's fair, but the other thing happens WAY more.

It honestly just strikes me as very odd how e.g. even "just use multiple languages" is done a whole lot, but not really talked about as a good idea (as much as MY LANGUAGE IS GOOD AND YOURS IS NOT)


The author of this article does highlight the cases where using Python didn't work out well for them, in fact multiple times.


The context the author was working in was a large corporation with a demanding collection of services that needed to be managed. Many of the tools to manage those servers were written in Python, since many people know it, it was already widely used at the org, it had good support for C++ interop (also widely used at the org), and Java was found to be really clunky for sysadmin tooling.

I've seen both sides of it- I worked in the same sub-org as Jos, and my very first job was cleaning up a large pile of incredibly important and variable-quality Python code used to manage a fleet of database servers. The code was tools to do useful things like implement failover of the replication master (across ~90-120 shards) from one region to another for maintenance. Or apply schema changes across all those master shards. Or monitor the shards at runtime.

At times, the code would Exception (literally, crash with a Python exception) during the middle of an important but routine maintenance, and the migration would be half-done. I was hired- literally, this was my job- to add tests to the code until it was much more reliable. Working on that convinced me that rather than type-safety (which is nice, and can be used optionally in Python), high test quality and high test coverage of paths used in production were more important to keep the code running smoothly.

I just wish Python hadn't made strings a sequence type, as one of the most common errors at the org was accidentally e m a i l i n g e v e r y s i n g l e letter in a To: string. IE, if it was "To: bore-sre@stoogle.com", then b@, o@, r@, e@, etc, would all get an email saying "Process failed..." And r@ would reply (because he's rob pike) saying "your python program has a bug...."
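
A minimal sketch of that gotcha (send_mail is a stub standing in for the real mailer):

    # A str is itself a sequence, so iterating it yields characters, not addresses.
    def send_mail(to, subject):                    # stub standing in for the real mailer
        print(f"mail to {to}: {subject}")

    recipients = "bore-sre@stoogle.com"            # meant to be a list of addresses
    for addr in recipients:                        # yields 'b', 'o', 'r', 'e', ...
        send_mail(to=addr, subject="Process failed...")

    # What was intended:
    recipients = ["bore-sre@stoogle.com"]
    for addr in recipients:                        # yields the one full address
        send_mail(to=addr, subject="Process failed...")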


Python is one of my least favorite languages and I avoid it wherever I can. I agree with several of the criticisms here, but I disagree with this part:

> The problem with Python is of course that it is an interpreted language with lazy evaluation

That isn't "the problem" with Python. There's nothing wrong with these sorts of languages. However, it does limit the sorts of problems the language is suited for, and I do see places where Python was used where another language would have produced far better results.

Perhaps using Python inappropriately leads to some thinking that the fault is with the language?


> with lazy evaluation

I know it has functions that are lazy, but it's not lazy in the sense that Haskell is, right? I never use it as I find it a ghastly horror show (my taste, other people like it, that's fine), but I had to use it a few times and found that (also from the article) some parts are lazy, but not Python as a language. Is that not correct?

> it does limit the sorts of problems the language is suited for,

Interpreted (...) is a property of the implementation; there is no reason why it has to be interpreted.


You are right, the article is wrong. It is most certainly not lazy. f(g(), h()) will always call both h and g regardless of whether f uses their results.

Iteration in a sense can be "lazy", but that laziness is via data structures built on top of a strict core language. Python is not alone in having lazy iterable data structures, but it leans into them relatively hard in its standard library.

Many of us love this, as it lets for loops work elegantly over all sorts of abstractions, but I could see some folks disliking it. Still, the author does not explain this nuance, which makes me think he does not quite know what he is talking about.
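
A minimal sketch of that distinction:

    # Evaluation is strict: both g() and h() run before f is entered,
    # even though f ignores its second argument.
    def g():
        print("g ran")
        return 1

    def h():
        print("h ran")
        return 2

    def f(a, b):
        return a

    f(g(), h())                               # prints "g ran" then "h ran"

    # Laziness lives in iterator objects layered on top of that strict core:
    squares = (n * n for n in range(10**9))   # nothing computed yet
    print(next(squares))                      # computes only the first value: 0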


Almost every mainstream language has a way to "lazily" (in the sense you mean) iterate over lists, so this can't be what the author was singling out. I think he's just confused.


But that's just libraries though. A language that actually evaluates outer to inner is different. And yes, indeed confused I would say.


Almost surely the author of TFA meant "dynamic", not lazy.

Nobody uses "lazy" in the sense of TFA.


It’s just plain wrong. The Wikipedia link points to “lazy” in the Haskell sense. Python is not that.


Agreed. It's so evidently wrong my guess is that the author thinks "lazy" means "dynamic", searched Wikipedia and pasted a link to the article without reading it (in which case he would have found it didn't match his understanding).

It's the less insane explanation. The alternative, that the author read and understood what lazy evaluation is, but somehow still thinks Python does it this way, is too crazy to consider.


The article meant Dynamic instead of Lazy? Ah.. Well, that's terrible use of words that mean something then.


Yes and yes.

It doesn't bode well for a bit of criticism when it starts by grossly misusing a technical term.


Not Haskell-style lazy, but more that declarations are just another kind of statement, not evaluated until those lines are executed. This means you can have function definitions inside of ifs, which can be very useful for conditional programming / metaprogramming. Similarly, all variable accesses happen at runtime. This makes it so you can't truly statically verify the program; the only way to know for sure how it will behave is to run it.

In an odd twist, function argument defaults are evaluated at function definition execution time rather than call-time (so half-way in laziness) so a `[]` as a default is shared between all users of that function and if you modify it you are changing a global. I heard that this led to a vulnerability within some security software.
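
A minimal sketch of that default-argument behavior:

    # Default values are evaluated once, when the "def" statement executes.
    def append_item(item, bucket=[]):
        bucket.append(item)
        return bucket

    print(append_item(1))    # [1]
    print(append_item(2))    # [1, 2] -- the same list is shared across calls

    # Common workaround: use a sentinel and build the list per call.
    def append_item_safe(item, bucket=None):
        if bucket is None:
            bucket = []
        bucket.append(item)
        return bucket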


What you describe is not lazy evaluation in any accepted sense of the term. I bet you it's also not what the author meant; he was probably thinking of dynamic typing.

Alternatively, maybe the author (and you) meant "interpreted language"? Lazy/strict evaluation is an orthogonal aspect.


Yes, pretty much agree with this word for word. It is very, very difficult to refactor a Python application in any sort of reliable way. The standard way of error handling in Python appears to be to present the user with a stack trace. Very user friendly (not!). Now people will say that, for instance, mypy can help with this. That is true, but since projects can be started without type checking, chances are that your project was started without type checking, that introducing mypy is somewhere on the backlog, and that when it comes off the backlog it will be enabled only partially because otherwise there would be too many errors, and so on. It is such a garbage programming environment.


I couldn’t (and still can’t) believe that non-exceptional situations are considered exceptional. Such as there not being a way to parse a string into a number and return a value indicating if it was successful or not. With Python everything is an exception. Messy and inelegant. Of course this mess is called “pythonic” so everything is fine…

https://blog.codinghorror.com/exception-driven-development/a...

https://stackoverflow.com/questions/2184935/performance-cost...
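
For illustration, a minimal sketch (hypothetical helper name) of the non-throwing parse being asked for, built on top of the exception-based int():

    def try_parse_int(text):
        # Returns (ok, value) instead of raising on bad input.
        try:
            return True, int(text)
        except ValueError:
            return False, 0

    print(try_parse_int("42"))     # (True, 42)
    print(try_parse_int("oops"))   # (False, 0)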


> parse a string into a number and return a value indicating if it was successful or not.

Eric Lippert had a decent blog post[1] back in 2008 on this titled "Vexing Exceptions" with regards to .NET.

[1]: https://ericlippert.com/2008/09/10/vexing-exceptions/


Whether or not you agree with it, the argument for exception handling, rather than return values, is that it lets the code flow more naturally for the successful case. The alternative is Go's repetitive

    f, err := fn(...)
    if err != nil { ... }

pattern. Which is better is really a matter of taste. To have error handling smattered all over the code seems more inelegant and messy to me.
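
For contrast, a minimal Python sketch of the exception-based flow being described, where the happy path reads straight through and failures are handled in one place (a JSON config read is used here as a stand-in example):

    import json, sys

    def load_config(path):
        # Happy path reads top to bottom; any failure jumps to one handler.
        try:
            raw = open(path).read()    # may raise OSError
            return json.loads(raw)     # may raise ValueError (JSONDecodeError)
        except (OSError, ValueError) as err:
            print(f"could not load {path}: {err}", file=sys.stderr)
            return None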


What? You can return Exceptions instead of raising them. Since the language is dynamic, the caller can introspect.

Personally, from the perspective of a number parsing function, I think being passed an unparseable number counts as an "exceptional" situation- InvalidArgumentError- and I don't care which way it's returned- as a raised exception, or as an error object- as long as it's clearly documented, and the use matches the semantics of the function (NetworkNotAvailable is a better example of an exception where you'd want to put something in the exception block)


> The standard way of error handling in python appears to be to present the user with a stack trace.

What do you expect it to do? Silently fail and move on?


No. How about something that is neither "silently fail" or "print a stack trace"? How about something that lets the programmer handle the error?


Yea python should introduce try except.


Python has had proper error handling for 20+ years.


A few weeks ago I was working on a small wildfire smoke and fire perimeter API, and I hit a few annoying snags due to tooling issues. I needed to process a lot of different formats of layered geographic datasets, and converting one thing to another, processing the data into various buckets, cleaning, aggregating, etc. all wound up being extremely cumbersome and verbose.

I write a lot of Go and I’m used to that. But when I hit snags in the less familiar territory of geographic data processing, it was a slog. Terrible documentation for the libraries was a major barrier, and otherwise it seemed as though essential features simply didn’t exist and I’d have to invent them.

I got the idea to explore Python for the project because people use it for data processing. I’ve used it in the past, though never for this. Whatever, I thought, at the very least I can validate that Go is a suitable tool.

Within a day I had rebuilt everything with Python. I built a flask app around the forecasting and fire perimeter tools, and had it deployed the same evening. It was mind blowing.

As an ecosystem I was absolutely blown away by Python. Do I like the language? Not really. I encountered so many cases where something could be so much faster and more efficient with Go. Deployment would be easier. I’d get more API from the same resources. Scaling would be ten times easier. Static typing tools kept blowing up my IDE because this library doesn’t support this, or the type tool is wrong about that. It was very janky at times.

Yet Python got it done. It’s live right now, and development is steady. Go was not steady and I didn’t see any solution to that in sight without reinventing countless wheels.


By the way, another nice thing about python is that you can force your program to drop into a debugger with an interactive shell at runtime- and inspect/print objects. In today's complex world of generic types, it can often be hard to see exactly which method is handling your data.
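
For example, the built-in breakpoint() hook (Python 3.7+) drops into pdb wherever you call it:

    def transform(records):
        for record in records:
            breakpoint()      # pauses here in pdb: inspect record, step, print, continue
            print(record)     # stand-in for the real per-record work

    transform([{"id": 1}, {"id": 2}])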


My favourite things about Python are the huge community and the rapid iteration. Don't get me wrong, I like Go, too. It was the primary language I worked in at my previous job. But sometimes you just need the big community and the huge pool of documentation, or you want something to work now instead of fiddling with it, and Python is great for that feedback loop.


My experience with Python can be summed up as: it's tempting to start something with it, since it has such low (initial) friction and hey, "this is just a small throwaway project anyway".

Months or years later, it's a beast, hard to understand or refactor, full of holes and pitfalls, and Python's terrible tooling doesn't help either.

And I never learn the lesson!


What tooling are you missing?


I've been down this road before on HN, so let's agree I don't find the existing tooling satisfying at all.

The linter is ok, and an occasional lifesaver (but it shouldn't even be needed! It requires extra work to catch problems that other languages catch "for free"). And it shouldn't be a separate tool. It's also cumbersome to use, to silence what is not needed (it's way too noisy), and to fine-tune. Inline comments to enable/disable it for specific warnings look ugly, too. Python devs tend to suppress whatever bothers them instead of fixing it because it's not in their culture.

The type hinting checking is terrible. It's getting better, but it still misses obvious things and requires too much hand-holding. In my experience, average Python devs don't use it because they don't understand it, or don't find the ROI worthwhile. And because it's optional, they can just pretend it doesn't exist (or complain if you make it mandatory).

The mess that is dependency management has been discussed multiple times. In Python's defense, it's in "good" company with other messes from different languages. But Python's case seems particularly horrifying.

In general, with tooling, Python lives in a special hell where every blog and article will tell you "it's awful because you're doing it wrong, you should instead [use|avoid] pip, pyenv, pipenv, poetry, <my custom script>, <some deprecated tool that nobody else recommends>, <cutting edge tool that is incompatible with everything else>".


The author of the post seems like an evangelist for the Go programming language.

> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living.


What's funny is that (in a different Python rant from another author), it was pointed out that Google was the heaviest pusher of Python in the early 2000s. It probably would have been Java (from Android and elsewhere) had it not been for all the legal stuff with Oracle brewing. So, Python here, Python there, Python everywhere... then Google invented Go and Dart and other shiny new toys and began pushing them everywhere.


Go was written to replace C++ and Java at Google, not Python. But after it launched many Python developers in SRE switched to Go- for good reasons. Initially it was a bit of a surprise to the Go creators.

Python (fronting C++ code) still plays a huge role at Google. I don't see that changing. Go has almost zero story for scientific computing.


> It probably would have been Java (from Android and elsewhere) had it not been all the legal stuff with Oracle brewing.

Oracle started buying Sun (and consequently Java) in 2009; the merger was completed in 2010. So I fail to see how that could have influenced Google's purported aversion to Java in the early 2000s, even as late as 2007 or 2008.


> And, not to put too fine a point on it, but if you can code Python but not Go (or another decent programming language), you probably have no business writing software for a living.

Come on man. There's being opinionated, and then there's this.


I’m not a fan of Python, but this article is insufferable in tone. If Python doesn’t meet your needs, don’t use it. There’s plenty of oddities to gripe about in Python, but this article doesn’t attempt to learn something or make a larger point beyond “my favored approach is the only valid approach”. Sorry, but that’s not an opinion worth considering.


As much as I personally dislike Python, I completely agree with this, but want to flip it. If Python meets your needs, absolutely use it. Don't complain about how Visual BASIC doesn't work for you on your way to downloading whatever version of Python is current this month.


No enforced static typing and no proper debugger make Python painful for large code bases. It’s good for scripts, prototyping, and gluing libraries together to make utilities, but if something expands beyond a single file I stop wanting to use Python. Convincing my employer of this is another matter and why I would rather avoid it completely.


  The problem with Python is of course that it is an interpreted language with lazy evaluation […]
Huh?


They probably mean dynamic.


It doesn't bode well when an article starts with such a gross misuse of a technical term.

Far be it from me to defend Python, but this rant didn't start well.


Came here to make that exact comment. But I didn't see this comment, so I said the same thing.


I have grown to like Python for small programs and scripts because of libraries like TensorFlow, PyTorch, LangChain, etc.

I agree with the author that there are better languages for large applications.


I have developed insanely complicated software with Python that works pretty well.

If you expect to find Java or C in Python you are looking at the wrong place.


And now, of course, it's deprecated because you wrote it in a version of Python that's no longer in support or used a library that hasn't been (or maybe won't be) ported to 3.11.


I did 2 to 3 migrations. It was fine. I would rather deal with these types of issues once in a while than work everyday with Java.


Yeah. I did 90kloc of python porting. Twice. Never working for a company that allows developers to use python again.

I'm glad you were able to port your scripts, but python is absolutely not appropriate for mission critical software.


> I once worked with a service in Python that forked worker processes to handle requests, ensuring that all cores could be used.

> > Unfortunately these workers ran out of memory quickly so we decided to have workers terminate themselves after a configurable number of requests had been handled so that Linux would do our memory management for us.

I once worked with a service in Python that was essentially an image server for a special biomedical image format. The decoder was a proprietary C library written by a vendor and, unsurprisingly, had tons of memory leak issues. This exact feature of Gunicorn [0] saved us months of going back and forth with the vendor to get the library to stop killing our servers.

Python has its flaws, but so does anything touched by humans.

[0] https://docs.gunicorn.org/en/stable/settings.html#max-reques...
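
For reference, a minimal sketch of the recycling knobs being referred to, in a gunicorn.conf.py (the values are illustrative):

    # gunicorn.conf.py -- illustrative values, tune for your workload
    workers = 4
    max_requests = 1000          # recycle a worker after it has served 1000 requests
    max_requests_jitter = 50     # stagger the limit so workers don't all restart at once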


Really, for all the complaints about JS/NPM/Electron, it looks absolutely genius next to Python tooling and PyInstaller.

It's extremely frustrating that you're forced into using it to access technology that doesn't even really use Python; it's just the composing glue sticking the native C++ or GPGPU code blobs together.


> Using Python for a large application is like building a nuclear reactor using Lego bricks

I like this comparison. Anyway, it's interesting to note that it took the author "many years of experience running large applications written in Python" to come to his conclusions. The advantages of static typing and the disadvantages of dynamic or duck typing have been well known for decades. The problem is less Python as a language than the fly-by-night decisions to just use it for anything. To stick with the example: what prevents people from using "Lego bricks" (or a high-temperature-proof version thereof) to build a reactor? Sound engineering decisions and, most importantly, safety regulations.


>forked worker processes to handle requests

File this one under "things that UNIX systems programmers think will work in principle but end up being massive black-holes of attempting to quiesce any non-trivial application in a way that results in a sensible post-fork state".


Python is a horrible language, but not for the reasons the author gives. Just because range() returns a generator doesn't mean the whole language is lazy. Several Lisps allow something like duck typing and they're not horrible. It is possible to reason about program behaviour in dynamic languages, but JavaScript certainly makes it hard.

Python is a horrible language because it is not a language. It is a family of languages that changes every October. Sure, 3.x doesn't introduce as many backwards-incompatible changes per version as 2.x did, but that's like saying World War I wasn't horrible because towards the end fewer people seemed to be dying every week.


I've been programming in Python for most of the past 10 years, and I've never experienced a regression in backwards-compatibility between minor releases. What problems have you had?


The syntax of the language changes every version in non-backwards-compatible ways.

If you have a couple scripts, sure, maybe you're not affected. But when you buy a company that shat out 90kloc of python and then all the employees quit, it's not a happy day.

And sure... I shouldn't have used those features. I get it. I'm the one who's bad because I'm calling your baby ugly. Even though I wasn't the one that originally wrote that code.

Though I did write some code that used package variables. And then the syntax for package variables changed, but that was an easy fix. And then the scope of package variables changed to be class variables, which is totally fine, but harder to find. Then the syntax changed again, but in a way that made it harder to find. And then the debugger stopped working if you enabled async io for a few versions.

Python is for total amateur code fluffers.


Which syntactic features have changed in ways that aren't backwards compatible? I've had some minor headaches, don't get me wrong, but in each case those headaches are a result of interface changes to common objects, like Exception-types. Python has added some syntactic sugar between minor releases, sure, but never at the expense of backwards compatibility.


Package variables. Class variables. Syntax changed between 2.3, 2.5 and 2.7. And the semantics changed somewhere between 2.3 and 2.7.


Wow, didn't realize folks were still actively developing on Python 2.x. FWIW, the Python steering committee takes backwards-compatibility pretty seriously these days. Here's a recent mailing list discussion on it's decision to reject a popular PEP on the grounds that it would have broken Pydantic: https://mail.python.org/archives/list/python-dev@python.org/...


I used to love python. It made me productive.

Until it introduced the haphazard type system. Now I need to import types in every file, use IF to guard it in CI in every file, and use a powerful IDE to be able to use the benefits of typing.


> Now I need

You don't need to do anything, you can ignore all type hints

> use IF to guard it in CI in every file

Are you talking about "if TYPE_CHECKING:"?

Your other option is to put "from __future__ import annotations" at the top of the file, or wait for Python 3.13 when PEP 649 lands and type annotations become lazily evaluated.
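
For reference, a minimal sketch of those two idioms together (heavy_module / ExpensiveThing are placeholder names):

    from __future__ import annotations       # annotations are no longer evaluated at runtime
    from typing import TYPE_CHECKING

    if TYPE_CHECKING:
        # Seen only by the type checker; never imported at runtime.
        from heavy_module import ExpensiveThing

    def describe(thing: ExpensiveThing) -> str:
        return repr(thing)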


typing brings me no happiness either, because it's a lot of work without being complete anyway. Annotations that are not checked vs actual behaviour at runtime can always, by the laws of programming, be subtly incorrect.

I still use python. The recently introduced match statement is a great addition, IMO.


Can someone explain this part to me, please? I don't follow what's going on.

> Python's use of reference counting defeated copy-on-write because even memory blocks holding variables that were read-only were actually written to in order to manipulate the reference counts, thereby blowing up the combined physical memory footprint of the workers. We solved this by smurfing the interpreter to use a magic reference count number for all variables that were created by the master process and inherited by the workers, and then not touching reference counts that had the magic value.

Thanks


You have a program that for whatever reason (the Python runtime in this case) only works single-threaded, although its workload could be easily parallelized (say, it’s a web server where requests are processed independently). An old established way to accomplish this is to start a “master” process which forks N “worker” processes, each of which can happily run single-threaded.

This would be a nonstarter if it required N+1 times the memory of the single process, so the OS uses an optimization called copy-on-write. When a process forks, all its physical memory is shared by the new process so it takes almost no new memory to start. If the new process writes to a memory page, that physical page is copied so it has its own version. (Thus “copy on write”.)

For most programs this works fine, but if you have a runtime that does garbage collection using a technique that requires writing to an object even if the code doesn’t change any of its values, trouble ensues. With reference counting, you have to write a new reference count for an object anytime a pointer to the object is assigned. If you store the reference count in the object, that means its physical page has to be copied. So now the CoW optimization totally doesn’t work, because just referencing an object causes it to take up additional new memory.

Ruby used to have this same problem, and after Ruby webservers became popular (hello Rails) they eventually incorporated a patch to move the GC information somewhere outside the actual object heap. Other systems like the JVM use similar techniques to store the bookkeeping bits somewhere other than the object field bits.

So what the OP did is patch the runtime so the objects created in the master process (pre-forking) have special reference counts that are never altered. This mostly works, because the master process generally does a bunch of setup so its objects were mostly not going to be garbage anyway.
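
As an aside, CPython later grew a standard-library knob aimed at this scenario: gc.freeze() (Python 3.7+) parks all existing objects in a permanent generation so the cyclic collector stops writing to their GC bookkeeping. It does not stop reference-count writes (that is what the interpreter patch above addressed, and roughly what PEP 683's immortal objects later standardized for built-in singletons), but it removes one source of copy-on-write traffic in pre-forking servers. A minimal sketch, with load_config/serve_forever as stand-in helpers:

    import gc
    import os

    def load_config():
        return {"shards": 90}          # stand-in for the real setup work

    def serve_forever(cfg):
        pass                           # stand-in for the worker loop

    gc.disable()                       # per the gc docs: avoid freed "holes" in parent pages
    shared = load_config()             # build the read-mostly shared state
    gc.freeze()                        # move existing objects to the permanent generation

    for _ in range(os.cpu_count() or 1):
        if os.fork() == 0:             # child process
            gc.enable()                # collect only what the child itself creates
            serve_forever(shared)
            os._exit(0)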


Thank you, this is a great explanation - much appreciated


I don't understand the "smurfing" solution he references, but CPython's runtime uses reference counts in each referenced value to detect garbage (when a value can be freed), which means even read-only values can be modified in memory by the runtime as object references come and go.

Those modifications force pages which were created on forking a child process as copy-on-write (meaning they share the same physical page until the page is modified by the child) to be copied and thus blow out any memory savings that would normally happen with copy-on-write.


It's almost like curly braces make for better languages :-P

My problem with python is its package system, and the mess around the fact it was designed to be global. (I have a similar gripe with Ruby).


> because the value of a good programming language is that it will not allow you to write programs that are structurally deficient.

Ummm... okay.

I'm not going to cheerlead for Python here (in fact I do not like it at all and also avoid it whenever possible) but many of this author's points seem to boil down to "screwdrivers are bad, here's why you should always use hammers instead".

Different tools exist for different purposes.


Python is a gravity pool attracting inexperienced programmers. And very often (in my experience), it shows.

Lack of static typing is nothing in comparison with lack of common sense and unwillingness to learn ("why force oneself ? The job market swallows everything anyway") .


Well said! Another way of putting it: Python isn't a production-ready language, due to the way people are using it.

Whenever some project I find doesn't just work, or works and then a few weeks later stops working, it always seems to be Python. People cannot semver properly (even Python itself? Or is the fact that some programs work on 3.9 and not 3.10 the programmers' fault again?) and also cannot lock their dependency versions properly. Same problem that can happen to nodejs code, and yet, I rarely see such basic failures there.

I also just don't understand why anyone would even want to use python, anyway. I've tried to debug python code, or write new python code, but I could never get into it, nor was it easy to read code others had written. Whitespace sensitivity? Special __ files for package purposes? No JIT (no idea why pypy is a separate thing rather than rolled in)? I just don't see the advantage over JS which seems to do everything better. It even has the benefit of being syntactically intuitive for users of other languages, and typescript really has saved JS from chaos. It's fine to be different from C syntactically, but I don't see the benefit of python to justify that syntax taking up another spot in my brain I could use for learning rust or something.


> Lack of static typing is nothing in comparison with lack of common sense and unwillingness to learn

I have never found any language community to lack that, and if anything it's more where people have lots of experience exclusively in one language than with inexperienced programmers picking up their first (who tend to, by nature, have a willingness to learn, even if they have a lack of the common base of experience that gets labelled “common sense”.)


Everything is terrible if you use it long enough. Some things are more terrible than others for certain use cases - a thoughtful developer understands the weak points of the tooling, and selects the proper tool for the job at hand.


Interesting timing given the massive increase in Python for ML and data science applications lately.

Pytorch is a great library too. It is hard to imagine Python decreasing in usage any time soon.


Wow, saying python programmers are unprofessional.... that's amazing. Just completely and totally out of touch with the real world.

I'm glad I never reported to him while at Google.


The problem with Python is that it introduced a whole bunch of people into the programming world who have only a minimal idea about programming or computers (mostly data scientists), just like how computer science introduced a whole bunch of people into the “engineering” world even though they know little about computers at a low level, let alone at an OS level. That’s why several SaaS products like Heroku tried to close that gap: because those “engineers” can’t into an OS!


It's the same old, tired rant about how Python's terrible.

The post is basically a surgeon complaining that a chef's knife is terrible for surgery; or how the F-16 can't loiter in the battlefield and deliver strafing fire; or how carbon fibre tubes don't handle compressive strength under water; or using C to do string manipulation and complex regex.

You're using the wrong tool for the job.


I thought it was just the whitespace.


This {{insert_programming_tool_here}} has not worked well for me and the projects I’ve worked on, so nobody should use it. And if you do use it you are not a real programmer and you should be ashamed. Because only someone who uses {{insert_programming_tools_i_like}} can call themselves a real programmer. The recipe of these articles.


> Of course Go is not perfect (hint: no programming language is)

This is the only relevant statement.


Python was already the most popular language for carbon-based intelligence, but now it's also becoming the one and only language for silicon-based intelligences.

The future is artificial intelligence programming Python and human programmers writing blog posts about how terrible Python is.


My problem with Python is that it's branded as cross-platform when in the end you are required to learn Docker and run in a Linux environment to really stop suffering.


How does running a Python 3.4 app in docker help? 3.4 is deprecated and will get no more security patches. Running it in docker doesn't change this.


It is 2023 and we have Ruff and Pyright.



