Ask HN: Why did Python win?
578 points by MatthiasPortzel 8 months ago | 828 comments
I started programming in ~2013 in JavaScript. I’ve since learned and tried a handful of languages, including Python, but JavaScript was always my favorite. Just within the last year I learned Ruby, and I was blown away by how fun and easy to use it is. At the present time, I’m starting all my new projects in Ruby.

My impression is that in the ‘00s, Python and Ruby were both relatively new, dynamically typed, “English-like” languages. And for a while these languages had similar popularity.

Now Ruby is still very much alive; there are plenty of Rails jobs available and exciting things happening with Ruby itself. But Python has become a titan in the last ten years. It has continued to grow exponentially and Ruby has not.

I can guess as to why (Python’s math libraries, numpy and pandas make it appealing to academics; Python is simpler and possibly easier to learn; Rails was so popular that it was synonymous with Ruby) but I wasn’t paying attention at that time. So I’m interested in hearing from some of the older programmers about why Ruby has stalled out and Python has become possibly the most popular programming language (when, in my opinion, Ruby is the better language).




Python ended up 'specializing' in data contexts, thanks to Numpy / Pandas, and as a result ended up becoming the first exposure to programming that anyone doing data stuff had. That was millions of people. In that space, it had no competitors.

Ruby ended up 'specializing' in web dev, because of Rails. But when Node and React came out, Ruby on Rails had to compete with Nodejs + React / MERN as a way of building a web app. Since people first learning programming to build a web app would usually start with javascript anyway (since a lot of very first projects might not even need a backend), it was a lot easier for the Nodejs/React route to become the default path. Whereas if you were a data scientist, you started on python, and as you got better, you basically just kept using python.


I think Python was popular as a general-purpose language first. After all, there was a reason people put so much effort into writing Numpy in the first place.

I think a lot of people were attracted to the language design, as captured in the Zen of Python (https://peps.python.org/pep-0020/), such as:

Explicit is better than implicit.

Readability counts.

Errors should never pass silently (unless explicitly silenced)

There should be one-- and preferably only one --obvious way to do it.
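
(The full list ships with the interpreter itself as an easter egg:)

    # Printing all of PEP 20 is a one-liner:
    import this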

In many cases, Ruby has almost the opposite philosophy. There's nothing wrong with that - but I think a lot of people prefer Python's choices.


> There should be one-- and preferably only one --obvious way to do it.

This is so hilariously wrong in python though


So I imagine you have that perspective because you started less than 20 years ago. In some ways the idea of the Pythonic Way to do things evolved in opposition to Perl's vigorous advocacy of More Than One Way.

Python has been really winning for some time, so it's natural that its ideological discipline has grown ragged. The crop of kids who value options above consistency don't have the scars of the Perl age to inform their prejudices.

But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.


Back when I decided it was time to add a scripting language, Perl and Python seemed like the obvious choices, and in my mind were equally good options. I asked my best friend which I should choose, and he more or less said, "You can't go wrong with either one, but when you ask for help Perl people are assholes and Python people are nice."

I can't confirm his thoughts on Perl and I haven't interacted much with Ruby, but the Python community is definitely welcoming and patient in my experience. I wouldn't be surprised if this was a significant factor in Python's prevalence over Perl, Ruby, or anything else.


Yep, the Perl community kind of had issues around the turn of the millennium, and the Perl 6 debacle did a lot to convince people that Perl was kind of a dead end.

I don't think there was any toxicity in the Ruby community, but it was made up of working programmers, whereas the big leading voices in the Python community were teaching assistants and students, so it might have been more tailored to newbies.

I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older, so I think the real answer lies in why the educational sector decided that teaching Python was easier, and significant whitespace plays a huge part here.


> I don't recall there being much real industrial use of Python prior to Ruby emerging, even if Python is technically older

Yeah that's my recollection too. About 2011ish there weren't a lot of jobs in python yet. Perhaps in SV, but not out in the real world. Several startups were doing it, including Youtube and google at the time.

But in the F500 world, python wasn't used at all. I started using it in 2008/9-ish.


> [...] the perl6 debacle did a lot to convince people that Perl was kind of a dead end.

Not GP, but the Python 2 vs 3 holy wars were also something that kept me from adopting Python as a scripting language for a couple of years.


Yeah, python 2->3 transition was painful. But, I would argue that was self inflicted. Guido and company chose not to develop a 2.8/2.9/etc series where people could move their code base over incrementally.

I mean, I love python, but that sucked!

Yes, it would have been more work for the devs, but the amount of work it meant for the users was worse.

In fact they pretty much just threw away anything before python 3.6 anyway now. Many things introduced in the 3.x series before 3.6 just don't work anymore (asyncio syntax being the notable one).
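
The asyncio case, roughly sketched -- the pre-3.5 generator spelling next to its replacement:

    import asyncio

    # Pre-3.5 spelling, deprecated in 3.8 and removed entirely in 3.11:
    #
    #     @asyncio.coroutine
    #     def fetch():
    #         yield from asyncio.sleep(1)

    # The async/await spelling that replaced it:
    async def fetch():
        await asyncio.sleep(1)

    asyncio.run(fetch())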


> Guido and company chose not to develop a 2.8/2.9/etc series where people could move their code base over incrementally.

That is literally what 2.7 was, as well as reimplementing some features in later Python 3 releases (up to 3.4).

The core team definitely had the wrong transition model at the start, and it took some time for the community to convince them then decide on which items were the most important, but let’s not act like they did not get it in the end.

> In fact they pretty much just threw away anything before python 3.6 anyway now. Many things introduced in the 3.x series before 3.6 just don't work anymore (asyncio syntax being the notable one).

What?


What would have been a better transition model? Are there any languages with major breaking changes that have done the upgrade smoothly?


> What would have been a better transition model?

Better supporting cross-version transition codebases.

The core team initially saw the transition as "run 2to3, fix what's left, publish updates, P2 is gone", but aside from 2to3 being quite limited, such a transition is quite hard for dependencies, as it means they leave all older dependents behind entirely (dependents which might be the primary users of e.g. company-sponsored projects), or they have to keep two different codebases in sync (which is hard), plus the limitations of PyPI in terms of version segregation.

What ended up happening instead was libraries would update to 2.7 then to cross-version codebases, this way their downstream could migrate at their leisure, a few years down the line people started dropping P2.

> Are there any languages with major breaking changes that have done the upgrade smoothly?

Some, but usually statically typed languages. E.g. Elm's upgrade tool worked pretty well as long as you didn't have native modules and all your dependencies had been ported. I think the Swift migrator ended up working pretty well after a while (Swift broke compatibility a lot "initially"), though I've less experience with that.

An alternative, again for statically typed languages more than dynamically typed ones, is to allow majorly different versions of the language to cohabit e.g. editions in Rust (Rust shares the stdlib between all editions but technically you could version the stdlib too).

Not workable for Python, not just because it doesn’t really have the tooling (it has some with the __future__ imports but nowhere near enough) but also because it changed runtime data model components specifically the entire string data model, which is not a small matter (and was by far the most difficult part of the transition, and why they piled on smaller breakages while at it really).


The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

Of course, I already knew Python, and so did the rest of my team, so we had been doing tools in Python (the guy wasn't on my team). But until he pushed Ruby into places where Python would have been better (importing a Python library rather than shelling out to a program), I was willing to accept that it was probably fine.


> The only Ruby person I've met was insistent that Ruby was the one true way, and he tried to force it into everything. That attitude turned me off.

I mean, if you read almost any Elixir article that has hit the front page of HN, there are always comments from Pythonistas saying, "Why bother when there's Python?" Similar attitude. Obviously it's not everyone, but it's not everyone in the Ruby community either.


It’s such a bizarre reason. “One person using something rubbed me the wrong way so I decided not to use it”. Did the person extrapolate to an entire community from a sample size of 1?


    The only Ruby person I've met was insistent that Ruby was the one true way
That sucks. I've been doing Ruby full-time since 2014 at 4 companies and I've never seen that sentiment, even from people who really love it. My experiences have been really positive.


I agree. I’ve found ruby and its developers to be pretty friendly and open to other languages and styles.


My experience with Python was simply people wanting to get shit done. This was circa 2008. They weren't really engaging in language wars, but doing innovative things like extending Java with Jython.

I was arguing for the F500 company I was working at to explore using Jython to write unit tests for Java code.

Why not have a scripting language to write unit tests for Java code?

I see this with Rust trying to extend python in interesting ways. I don't see this with Java trying to extend C/C++ or Python.


Python and Ruby have some things where the intuitions are exactly inverted from one another. It took me a long time to figure out why Python rubbed me the wrong way, and to realize that if I dig up how I used to structure code in Pascal, it's fine.

Not that I care much these days since I prefer writing in Elixir.


> I haven't interacted much with Ruby

“Matz is nice and so we are nice” https://en.wiktionary.org/wiki/MINASWAN :)

The Rails community is another story, unfortunately.


That’s funny because that’s one of the reasons I tend to point beginners to R instead of Python for data work.


I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.


> I can't imagine Python's welcoming community has anything to do with it. If anything it was Ruby that had a reputation for being the most welcoming community with its MINASWAN (cringe) philosophy.

TBH, community had nothing to do with Python's enormous success over its competitors (Perl and Ruby, possibly Tcl too).

Nor did any technical merit, nor ergonomics.

There's one, and only one, reason why Python exploded at the expense of the other competitors: The ease and acceptance of using the language as glue for C functions.

Python's popularity is built on a solid foundation of compatibility with C.

If, in the 90s, C++ had taken off enough to displace C, I doubt Python would be as popular as it is. Python owes its ubiquity to C, because if C was not ubiquitous, Python wouldn't be either.

(It's only recently, like the last 10 years or so, that I started seeing popular Python programs which didn't have a dependency on C libraries. And even now, it's still rare to see.)
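
For a feel of what that glue looks like, a minimal ctypes sketch (assuming a Unix-like system where find_library can locate libc):

    import ctypes
    import ctypes.util

    # Load the C standard library and call a plain C function directly,
    # with no wrapper module and no compile step.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))
    libc.printf(b"hello from C, called by Python: %d\n", 42)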


I don't think your analysis is accurate.

My experience of trying to get my own C functions working in Python has been nightmarish. Yes, you can do it... if you have exactly the same compiler & version used to produce the Python interpreter itself.

C's only usefulness to Python is: it allows optimization of the 80/20 or 90/10 rule, so performance doesn't have to totally suck with Python.

Python 'won' IMHO because it hit a sweet spot: simple enough for beginners, in fact beginner-friendly, but with a good basic set of datatypes (lists, tuples, sets, plus the usual ints, floats, and complex) that allowed complex ideas to be compactly expressed. The ability to switch between functional and imperative styles also helped.

Python is a 'good enough' lisp. MIT switched, and Norvig has said as much.

No, the astonishing thing is that Python survived the 2->3 transition, and came out stronger on the other end. Language cleanups, new 'syntactic sugar' (e.g. @ as the decorator syntax), and what you see is Python is trying to actively steal all the successful programming paradigms under one unified syntax.

Is python perfect? Hardly. But it's beginner-friendly and expert-optimized. AND, unlike C++ (at least for me), you can get ALL of Python into your head at the same time. (Libraries, ok, but true in any language). In this specific sense, it is exactly like C (you can keep it all in your head, even the edge cases).

There are newer languages gunning for a piece of Python's mindshare (Zig, Nim). But because Python is a moving target, getting better and better, the others will need to provide a spectacular use-case advantage --- and I just don't see that happening.


"Matz (the Ruby author) is nice and so we are nice" isn't so bad. It's twee-sounding, but saying you are going to follow the example set by the founder is absolutely fine. Is it any worse than the Python 'benevolent dictator for life' example?


I'm going to answer your question directly: No, it's not worse.

My interactions with Guido haven't been awesome. But the people put up with him regardless. The other people in python have been awesome.


I'm genuinely curious: what's "cringe" about MINASWAN?

(I write mostly Python these days, but have been involved in both communities for a long time, and MINASWAN never particularly stood out to me other than as a cute reminder to be nice.)


There is no problem with someone being nice. It's only a problem when they want you to be nice exactly like them.


It was meant tongue in cheek as I was defending Ruby being the most welcoming community. It's just a bit twee like the voice-over on London Underground - "See it, say it, sorted".


Oh I have met Ruby people and it's a big factor in why I never learnt the language.


> But Python is -dramatically- better focused, as a community, on finding a Pythonic way to proceed, and then advocating it, than previous cultures.

I would revise that to say that the Pythonic culture of one acceptable way to do things is better matched with a lot of good development practices.

Perl was also very good at finding a Perl way to proceed. It's just that with Perl that often meant a lot of implicitness and "do what I mean", and multiple ways to do it, so you could fit it within your preferred style and tailor it to the current project's needs.

That all sounds good until you are confronted with the flip side of the coin, which is that code written with those perspectives is harder to understand when you look at it for the first time or after a long hiatus from the project. That puts a lot of importance on policy and coding standards, which saps effort from just getting something done.

I love Perl, and it's still the language I write most often for work, and it's no great mystery why Python became preferred.


But Ruby took all the best bits of Perl so I'm still perplexed as to why Python "won".


Ruby did have a lot going for it about 15 years ago. Many Java/JSP people jumped ship and got on the ruby train. Ruby was a breath of fresh air comparatively.

Python had a great community though, and the Python Software Foundation went out of its way to make people feel accepted in the community. And frankly, programming is often more of a social activity than most people realize -- particularly for new programmers.

So new programmers tended to lean towards python, because the resources to lean on others were there. And people like Raymond Hettinger, Naomi Ceder, Ned Batchelder, and Yarko Tymurciak were approachable even if the BDFL wasn't.


What did the PSF do to make people feel accepted?


Taking a look at the bigger picture, it does indeed seem like the Perl philosophy lost to the Python philosophy overall.

Looking at the hip n cool languages, not just Python for scripting but surely Go and to some extent Rust as well for native stuff (Dart for scripting also but it didn't outright "win"), these are mostly languages that deliberately simplified things. Yes, even Rust - it needs to be compared to C++ and its biggest feature is basically that it doesn't let you do all the things C++ does, within a very similar model.

The only language that I heard is going against these trends (but I'm largely clueless what it's actually like) is Julia which sort of has its own "lay" niche like Perl did back in the day, and is mostly winning based on the premise that it's a performant scientific language.

The industry obsesses over costs of adoption, stability, maintenance; in short: how to get the most out of the least actual new development. It does make quite a lot of sense, to be honest, although sometimes it's really demotivating at an individual level.

And frankly, "learn the language, duh" usually comes up when the language is complex or unintuitive for no practical purpose. Of course there will be people who always complain if they have to learn anything, but I don't think they make up the majority, or even a significant minority, in the software engineering world. "Learning the language" is only as much of a virtue as the language itself serves its purpose, which includes ease of use and the readability of someone else's code.


Because the best bits of Perl were kinda trash that Python was smart to avoid.


Yeah, for sure, the Perl influence is why I dislike Ruby. It makes it really hard to read if you haven't been doing it constantly.


Python's clean, obvious syntax is what drew me to it. They actively decide not to do things that would detract from the cleanliness of the syntax. Very often, less is more. My biggest fear is Python might be forgetting this, which I see with things like the := operator, etc.
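
(For reference, the operator in question: assignment as an expression, added in Python 3.8.)

    data = [1, 4, 9, 16, 25]

    # := binds a name inside the expression itself -- compact, but one more
    # spelling for something a plain assignment already covered:
    if (n := len(data)) > 3:
        print(f"{n} elements, too many")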


God forbid we should have to familiarise ourselves with a language before using it.


God forbid things be intuitive.


Funny, because 99% of current regex libraries use Perl's regex extensions.


Not just practices, tooling. The whole typing system. Linting to check for mistakes. For the love of Black.


Python's static typing still feels very clunky and bolted on, and the tooling around it was rather bad, like mypy. In general, I definitely wouldn't say Python tooling was good. It's improving rapidly with ruff though, just as JavaScript's did when esbuild appeared.


Dumb question: I know the built-in typing (import typing) is limited in some ways, but it works pretty fine to cover most basic needs. So what does mypy add? I'm super interested in adding typing to some code we have but I'm just a bit confused by the choices available.

Would mypy with pydantic be a good combination or do they overlap?


The two are complementary: the built-in `typing` module only provides type annotations, not type checking. Mypy provides the latter, via the type annotations that you (and others) add to Python codebases.

Pydantic overlaps with mypy in the sense that it provides a mypy plugin, but otherwise it's just like any other Python library.
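
A minimal illustration of the split (the file and function names here are made up):

    # example.py
    def greet(name: str) -> str:
        return "Hello, " + name

    greet(42)  # plain `python example.py` runs this and crashes with a
               # TypeError at the concatenation; `mypy example.py` instead
               # reports the incompatible argument before anything runs.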


So mypy runs "at run time"? I guess that makes sense, I thought the annotations provided some form of checking too, but now I realize that I should really spend some time to inform myself better :').


Sort of -- mypy is its own standalone program, and you run it to typecheck your program's type annotations. It does some evaluation of the Python program (specifically imports and some very basic conditional evaluation, among other things), but it never really executes the codebase itself.


Mypy runs as a type linter/checker. See https://mypy-lang.org/


No, it's more that pydantic runs "at run time" while mypy does not.
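
Concretely (a sketch assuming a pydantic v2-style model):

    from pydantic import BaseModel

    class User(BaseModel):
        name: str
        age: int

    # mypy flags the bad argument statically, without executing anything;
    # pydantic validates the same annotations at run time and raises
    # ValidationError when this line actually executes.
    User(name="Ada", age="not a number")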


Isn't Python's tooling usually considered some of the worst?


By what standard? Yours? Java developers'? Rust developers'? Given that Python has typing now, tools to check it, and plenty of tooling around "make it faster", I think your world view might be stuck in 2010. Python has gone from a slow obscure scripting language to a powerhouse. Conda, NumPy, Scikit, PyTorch, GPU programming, games, analytics, web apps, APIs: I think it's safe to say this ain't your grandpa's Python anymore.


I have seen Python in action since 2010 plenty of times, including the present. It's been a mess every time. My worldview is driven by F# and Elixir, namely the `dotnet` and `mix` tooling, respectively. So no, Python's tooling does not impress me in the slightest. The first thing you need to do is get agreement on all the different checkers, linters, formatters. For Elixir, there's one tool for each task. My worldview feels quite current.

Those other things you mentioned aren't relevant to tooling. They're distributions, libraries, and applications. But note that Conda's existence was basically due to Python's tooling being poor.


You see tooling in a different view than typical Python users. Python users see things like iPython, notebooks, the ability to quickly do statistics and plot results, as tooling - to do their jobs. Data Science. Machine learning. Things that perplex and confound static-typing OOP purists. So yes, by your world view - Python is a mess. There's no one way to do things, there's no one tool, no one linter, no one formatter. I praise the fact that there isn't. What a boring world. What choice would you have if that tool didn't satisfy your needs? Find another language?

I began in that world view -- the C/C++/Java/.NET one, where everything must have a standard, a fixation on a singular consensus. That's not how things work in the open source world of Python, JavaScript, Rust, etc. Will there be gravitation towards a paradigm? Sure, until such time that a new one emerges.

If you looked at Python in the 2.5 days and look at Python today, you cannot argue that the tooling hasn't gotten better.


> You see tooling in a different view than typical Python users.

That doesn't surprise me.

I do what you mentioned with F# in Polyglot Notebooks (which support several languages in the same notebook) and Elixir in Livebook all the time, both of which are superior notebook implementations to the outdated Python Jupyter notebooks.

My worldview is not driven by comparing Python to C#, Java, C++, and other such ilk.


So following your logic, "some of the worst" includes the vast majority of actually used languages and tools. Must be cool to be on the special side. :)


They want to be lisp SOOO BAD! =)


+1 for mentioning that Python's original competitor was Perl. This point is forgotten some 20-30 years later.


Not just perl, but C/C++/Java as well. Ruby was a competitor to Java JSP development back in the day. And I remember when a lot of Java people jumped ship to Ruby. I moved from C++ to python over a decade ago and never looked back.

Back then, the python jobs were scarce -- but based upon how I picked up the language, many of the typical C++ issues just disappeared -- and I knew it was going to become popular.

One comp sci professor talking at a PyCon years ago made a point that maybe the best college level introductory course probably should not have been SICP based but Python. His example was that for the first time in 5 years of teaching intro courses he had people coming up to him looking to change majors.


C++ and Python are not competitors. Sure, both are Turing complete, so you can implement anything in either. However, whether you should is a different question. Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions. C++ compiles to fast binary code, while Python is typically 60x slower for logic (though often Python calls a fast algorithm implemented in a compiled language for anything significant, so this is hard to measure in typical programs). However, if your programs are not very large and you don't need every last bit of speed, Python is going to be easier to work with. (And frankly, if not working with a legacy C++ codebase, you should look at Rust, Ada, Go, or something else I'm not aware of; C++ has many ugly warts that you shouldn't need to worry about.)


As a person who uses C++, I must say something that also applies, somewhat, to Java.

We have all the cool kids like Kotlin, Rust, etc.

However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Besides that, C++ has improved a lot over time. It still has its warts and knives but you can code very reasonable C++ with a few guidelines and there are also linters, and -Wall -Wextra -Weverything -Werror. That takes you quite far in real life provided you have a reasonable amount of training.

I would choose C++ over Rust any day. The only hope I have for C++ successors are Cppfront and Carbon and they are highly experimental. As successors those two fit the bill.

There is a third one, my favorite. It is this one: https://www.hylo-lang.org/ but I am not sure how compatible it will/would be.


The one thing Rust is getting right that I hope Carbon et al. take from it is using the type system to manage memory.

Not having to explicitly remember `free` in safe Rust code is amazing. Knowing that if my types are sound, memory will be managed reasonably is great.

I also think that immutable by default, mutable by explicit declaration is pretty great.

I do think there is a lot of room to add better ergonomics on these ideas however


The amount of complexity that Rust adds is not worth it in most scenarios in my opinion. I can think of Rust as something for OSes with critical safety requirements.

Besides that, in real life you end up having C in most of your software, so I am not sure how Rust behaves compared to well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

There are many ways to achieve safety, and the most promising, IMHO, is the path that Hylo programming language is taking. It sticks to Generic programming and mutable value semantics.

The language base draws from Swift and it has very strong people behind that know what they are doing. For example David Abrahams and Sean Parent. This is an implementation in a language of many of the ideas from "Better Code" from Sean Parent. It has a very solid foundation.

Besides being a solid foundation for generic programming, value semantics, and concurrency, what I like the most is how much it simplifies the model by not escaping references all around and by preventing unnecessary copies. This removes the (IMHO) mess of having to manage reference escaping constantly in Rust, which is the reason why many patterns such as linked lists are not even possible.

And lists are not just an academic exercise, as I was told sometimes by Rust proponents. Linked structures (immutable linked structures, actually) are important in scenarios such as telco backends where you need replication, fast moving of data, history versioning, rollbacks, and so on.


> The amount of complexity that Rust adds is not worth it in most scenarios in my opinion. I can think of Rust as something for OSes with critical safety requirements.

It's difficult to have this discussion in any sane way when Rust (or C) comes up. I tried Rust, but I have projects to deliver on strict timelines and I have yet to find a client who is prepared to pay me for the (what I found to be) very large onramp time to gain deep Rust expertise.[1]

The argument of "just get gud" whenever you point out the deep learning curve of Rust is pointless; I have noticed that Rust experts only come when your employer is rich enough to pay the team to not deliver while they learn it: Basically only FAANGs and startups flush with VC money.

[1] When I have a small project I reach for C. When I need something bigger for which C is not suitable, I don't reach for C++, or Rust, I rather take the tiny performance hit and move to Go. On extremely large projects, where I work with others, C# and Java seem to hit the sweet spot.[2]

[2] Although, C# and Java are also getting a bit too complicated for my tastes too. Seems to me that every language follows C++ evolution towards complexity, because the people stewarding the language are experts in that language and/or in programming language theory.[3] They are comfortable with the changes they propose because they have no need to upskill (they are already at that skill level).

[3] I propose that a language designed by your average corporate developer who has 15 years of experience but no CS degree will have much higher adoption than languages designed by PL purists.


What makes you choose Go over C#/Java for medium projects? And why not go for the large projects?


> What makes you choose Go over C#/Java for medium projects?

Because I said:

>> C# and Java are also getting a bit too complicated for my tastes too.

I abhor complications.

> And why not go for the large projects?

Because I said:

>> where I work with others, C# and Java seem to hit the sweet spot

Yeah yeah, I know it sounds like I am whining (Maybe I am :-), but at least I am complaining about all of them.

Java and C# do appear to have been battle-tested for very large projects that aren't microservices.

Go? I dunno. I've only ever seen very large projects in Go using microservices. I like its simplicity.

My main complaint is that programming languages have too much minutiae to track that I really shouldn't have to be tracking.

Take, for example, asynchronous builtins:

Why are all the explanations wrapped in jargon that only an expert in the language would grok immediately? Promises? Futures? You gotta explain those, with examples, before you can explain what to do with a value from an async call. Go look at the popular introductions to async (say, on MDN for js, or Microsoft for C#, etc) and count how many times they have to explain something because of their leaky abstraction implementation rather than explaining the concept.

How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

That naturally leads into "So you need to check if it is finished using an identifier to identify which scheduled call you want to check"...

Which itself naturally leads to "The identifier you need was given to you when you scheduled the call"...

Which leads to "Using that identifier from `id = foo();`, you wait for the result like this: `result = id.wait()`".

You can even add "You can return that id, or pass it around so some other code can do `id.wait()`".
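
For what it's worth, Python's asyncio tasks map onto this framing fairly closely -- a sketch, with foo as a stand-in:

    import asyncio

    async def foo():
        await asyncio.sleep(0.1)  # stand-in for real asynchronous work
        return 42

    async def main():
        task = asyncio.create_task(foo())  # "id = foo()": schedule it
        # ...do other work while foo() runs...
        result = await task                # "id.wait()": wait for the result
        print(result)

    asyncio.run(main())

(Though even this sketch needs main itself declared async, which is exactly the function-coloring complaint below.)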

Now they don't explain it this way, because their implementation(s) is more of a leaky abstraction exposing irrelevant information about the compiler, than of a straightforward implementation of the concept. They are unable to separate the concept from their implementation.

The common implementation of async is so counterintuitive that they have to explain their particular implementation instead of the concept, because it has almost nothing to do with the concept. If they just explained the concept, programmers would still be confused because the implementation needs things like colored functions just to work poorly.

The concept of scheduled functions (which may return once, or may yield multiple times before returning), which is a simple path to understanding asynchronous calls, is completely divorced from the implementation which will produce errors like "cannot call asynchronous function from a top level"[1] or "cannot call await in a function not declared as async".[2]

So, yeah, I'm kinda upset at how programming has evolved over the years (I wrote my first program in 1986, so had a good seat for the later part of this particular movie), from "simple and straightforward", to "complex for complexities sake".

[1] Why? Because their abstraction is an abstraction of their implementation, and not an abstraction of asynchronous calls.

[2] See [1] above.


> How about simply saying "calling async functions only schedules the function for later execution, it doesn't execute it".

This is not always true.


All this may be true (I'm not the strongest C++ developer in the world, relatively limited exposure), however the Rust memory management via the type system feels natural once you wrap your head around it. That idea is really good. I always hated dealing with `delete`, `free` and `malloc`.

Being able to offload all that busy work to the type system is just nice. There are definitely ergonomic improvements that could be made around this.

All the rest? I'll leave that to someone else to talk through, as I'm no expert here.


I write C++ all the time, and I go months between needing new or delete. unique_ptr is a wonderful thing. Not quite as powerful as Rust's borrow checker, but it saves me a lot of thinking.


> however the Rust memory management via the type system feels natural once you wrap your head around it

It disallows many valid patterns. That is why I recommend taking a look at the Hylo programming language (previously called Val) to see what I think is a very good example of how to make a language safe without making the learning curve so steep and without needing a GC.


The way Rust does it may disallow valid patterns, but it is not inherent to the idea


> well-written C++ with all warnings and linters on and using smart pointers. But my hypothesis is that they do not stay very far apart in safety.

Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety. I mean, memory-unsafety is the number one vulnerability cause in software. C++ has many qualities, but safety certainly isn't one of them.


> Can C++ compilers + linters reliably detect all misuses of unique_ptr? Because that sounds like a halting-problem kind of problem, and as soon as you can't guarantee memory-safety, you're definitely not in the same ballpark in terms of safety.

Are C and assembly at the same level of memory safety? In theory yes... but no, not in practice.

And C and C++? Yes, in theory, in practice... C++ is safer.

How about Rust? In theory Rust is safer. In practice, you are going to use C libraries here and there, so... in practice not as safe as advertised.

Well-written C++ with -Wall -Werror, -Weverything, -Wextra... that is very safe, including detecting even dangling stuff to some extent (gcc-13). If you stick to `shared_ptr` and `unique_ptr`, no matter how much you complain about it, Rust with its C shims and C++ with all linters and a good environment are practically at similar levels of safety.

This is the practical, real thing that happens. I have used C++ daily for around 14 years professionally and 20 years in total.

You are all in the terrain of theory, but how much Rust and C++ have you really written?

Of course, the CVEs data about memory safety, well, those are true. And they are a real problem. But with a reasonably good use of C++ those would be much, much, much lower than they have been so far.


> However, when it is about finishing a project, beating C++ or Java ecosystems is almost impossible.

Yet, somehow people do this with python, perl, and ruby. Google hires professional python people too.


Not only do this, but do it way more successfully. I'll never get tired of repeating that among top YC startups, Java as a primary language contributes to roughly 1% of value, while Python + Ruby are almost at 70%.

https://charliereese.ca/y-combinator-top-50-software-startup...


If by successfully you mean time to market, for sure you are right.

C++ gives more return when you start to save in infra because you have a more efficient language, if coded properly. Same goes for Go vs Python.

The right tool for the right job. I would use (and will, I am on it) Django for a SaaS that I have for the backend. If things start to work relatively well, then, I will keep identifying, if there are, bottlenecks and migrate with a hybrid architecture parts of the workload to C++.

This allows me to save in infrastructure bills.


Hylo page says it was formerly Val.

IIRC, Val has been mentioned on HN a few times before.


As painful as the Python package system is, C/C++ is so much worse.

I remember trying to compile GTK+ on a Solaris system a decade ago, and remembering how terrible it was to even get it to compile.

You're really deluding yourself if you think spending your time in compiler dependency hell is so much better than Python's situation.


The new CMake package manager may make things easier.


> Python is very difficult to maintain when your program goes over 100k lines of code, while the static type system of C++ is good for millions

I see this argument a lot, but people often forget that Python is very concise (yet readable) compared to other languages.

100k LOC in Python typically contains way more business logic than C++, so it is only natural to be harder to maintain.


Python is much less concise at this size. Sure, your algorithms are more concise, but you lose all of that and more, because if you don't have 100% test coverage you never know when a stupid typo will make your program crash, while with C++ you can typically be fine with more reasonable coverage, say 80%, where the last 20% is things hard to test that are unlikely to fail anyway. At that scale Python is slower to build as well, because C++ lets your tests depend only on the files that changed, and thus your code-build-test cycle is faster despite C++ being famously slow to compile.


MIT dropping SICP/Scheme for Python coincided with the general decline of education as an end in itself. Python is the VHS of computer languages. I couldn't believe it when I heard MIT dropped a language renowned for its elegant lambda implementation in favour of a language in which lambdas were not only frowned upon but literally throttled. I think it says everything that MIT ditched Scheme and SICP at the same time, as it would be near impossible to teach SICP with Python.



I would argue that Python is the Betamax of computer languages, and C++ is the true VHS.

Fight me.

But seriously, Python is good. Don't let the perfect be the enemy of good, only computer science people can do this.


> the best college level introductory course probably should not have been SICP based but Python

I saw this linked on here recently: https://wizardforcel.gitbooks.io/sicp-in-python/content/


And we're of course forgetting Visual Basic, because who cares about Microsoft-land these days, but back when Python/Ruby emerged even Windows Server and IIS were relevant, as this was kind of the peak of Microsoft's dominance.


Yes, it is pretty weird to think that the VB.Net flavor was fairly widely used in web dev even as late as 2006.


> There should be one-- and preferably only one --obvious way to do it.

I just wish Python applied this approach to package management. It is needlessly complicated.


The obvious part is definitely lacking, but the long and short of it is basically to ignore all the newfangled solutions. They are much more trouble than they are worth. Just use pip with a venv and a requirements.txt (easier) or a pyproject.toml (cleaner).

I really fail to see what the newer tools actually bring to the table. From my experience they solve no actual problems and just introduce more bugs and weird behavior.


I think Python has some of the worst API documentation I’ve ever read.

Even Java puts it to shame and that is sad


Coming to Python from PHP, it was interesting to see that in the PHP world I'd get 80% of my knowledge from the PHP documentation and 20% elsewhere. In the Python world it's easily the other way around. It also doesn't help that the PHP documentation is more up to date, so I am using the most current info, while for Python their own docs are so bad I rely on other docs, but those all vary on what version of Python they depend on and whether they follow current best practice. The difference is night and day and one of the reasons I ended up going back to PHP.


Yeah the PHP documentation is extremely pragmatic and seems designed to get you going quickly with useful examples.

The Python documentation seems to be suffering from some sort of weird snobbery. Very wordy as another comment mentioned. Examples are frequently lacking or inadequate. They seem like they're trying to compete with MSDN in "professionalism" although these days even the MSDN examples are better. There is an entire ecosystem of python tutorial websites around the web that would not exist if the documentation was as helpful as that of PHP.


The stdlib documentation definitely has a unique flavour. I would characterise it as "wordy".


I disagree: Python seems to have a zillion build tools and multiple competing type-checkers.


I like Python, but most of the Zen has always been a meme, and not a guideline of design principles of either the language, or software written with it.

Besides the one you mention, I also find the "Explicit is better than implicit" line to be against everything Python stands for. The language is chock full of implicit behavior, and strongly encourages writing implicit code. From the dynamically typed nature of the language itself, to being able to define dunder methods that change the behavior of comparison and arithmetic operators.
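
For instance, operator dispatch via a dunder (Money is an invented example):

    class Money:
        def __init__(self, cents):
            self.cents = cents

        # Defining __add__ quietly changes what `+` means for this type;
        # nothing at the call site hints that custom code will run.
        def __add__(self, other):
            return Money(self.cents + other.cents)

    total = Money(150) + Money(250)
    print(total.cents)  # 400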

I really like Go partly because of this. Not only does the language itself strictly follow the explicit-over-implicit and TOOWTDI principles, but it encourages or even enforces them on the programmer.


It is absolutely not a meme, it's PEP 20 (https://peps.python.org/pep-0020/). Just because some people don't take it seriously doesn't mean it's not part of the language's soul.


I think the Zen was really important for Python's success as well. Having your core values right there, well defined and out in front, wasn't something you got with a lot of languages. Any language that's been around for a long time gets sorta... muddled.


I'm from the outside looking in so don't take me too seriously because I tried Python and bounced off of it; there's very likely elegant parts of the language that I never really internalized because I didn't spend enough time working with it.

But speaking as someone who tried Python because I agree with principles like "have one right way to do things" and "be explicit", my initial impression working with language is that much like real souls, Python's soul is metaphysical, unobservable, and doesn't seem to interact much with the physical world in an observable way ;)

If I had to list some of my main criticisms of Python it would be that the language seems to have way too much implicit behavior and seems to have way too many ways of doing everything. I'm going to say something heretical, but it was weirdly enough early Javascript that I found to be a lot more consistent and explicit[0]. Type casting was a disaster of course, dates and the standard APIs were a complete mess, but beyond that it was rare for me to look at Javascript code and think "I have no idea what the heck that is doing." But it happened all the time in Python, it took me a while to get used to things and I still feel generally less capable in Python than I do in other languages that I've spent less time working with. There's so many little syntactic tricks in the language that are... convenient, but I resent having to memorize all of them.

[0]: Until it started messing around with classes and const and Symbols and crap -- the language is probably much harder to learn now than it used to be in the past, but I don't know because I'm disconnected from new users now. But certainly having 3 ways to declare a variable now probably doesn't help new users.

----

As an example, just this weekend I tried to convert a Python codebase from 2.0 to 3.0 and was immediately hit by needing to resolve implicit casting rules about integers, buffers, and strings that were all different now. Python has this weird thing where sometimes it does implicit casting behind the scenes and sometimes it doesn't? There's probably a rule about it, but it's never been explained to me.

So then I wanted to figure out the best way to handle converting a string of hex values into a manageable format so I searched that up and got advised that I should use `struct.unpack` except to be careful because that would give me a tuple instead of a list and also would require me to pass in the length, so instead I should actually use `map(ord, s)`, which prints out something that certainly seems to look like a list, but is not subscriptable, which is not a thing that I knew that I needed to care about but apparently do because the program broke until I cast it back to a list. And probably what I should have done was list comprehension from the start? But it wasn't clear to me if I could do list comprehension on a string or not since I do know that strings in Python technically aren't lists, they're sequences, and anyway list comprehension was not what people were suggesting.
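
For the record, the zoo looks roughly like this (the hex string is invented):

    s = "48656c6c6f"             # hex for "Hello"

    data = bytes.fromhex(s)      # one common route: a bytes object
    as_list = [b for b in data]  # a comprehension over bytes yields ints
    via_map = map(ord, "Hello")  # map() in Python 3 is a lazy iterator...
    # via_map[0]                 # ...so subscripting it raises TypeError
    print(as_list, list(via_map))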

And I know it's unfair because this is very beginner stuff in the language, but my immediate thought was, "oh right, Python. Of course when my debugger prints out something that looks like an array of values it might be one of 3 or 4 different types behind the scenes, all of which will error for subtly different reasons. It was silly of me not to see this coming."

Again, fully aware that this is basic stuff that would completely go away with familiarity with the language, but like.. oh my goodness my kingdom for having one array type that just works everywhere and one iterable quality that supports the same manipulations everywhere no matter what the underlying type is. I'm trying to do quick scripts, if I cared about these distinctions and if I cared enough about performance to need multiple ways to have a list of values, I'd have written this in Rust or at least C# or some fully typed lower-level language. There doesn't need to be this many ways in a scripting language to say "I have an ordered collection of values."

I'm not saying you're wrong, I suspect you're right. I suspect the underlying language is much more elegant than what I'm seeing. All I'm saying is that the initial impressions of Python for people like me who are really inexperienced with the language are anything but the PEP 20 list -- the impressions are the opposite, and it's exactly why I bounced off of Python so hard.

And I don't think that's individuals doing something weird, that seems baked into the language? Individuals didn't give Python 4 different ways to represent a sequence of values. I don't think it's a few coders' fault that I'm constantly seeing syntax in Python where having the code look prettier seems to be the priority over making it understandable or explicit?

Again, take it with a grain of salt, just... I don't know, I always laugh when I see PEP 20 linked because it's so contrary to how I think the language looks to new users. I could compare this to something like Lisp, which I am also extremely inexperienced with and extremely bad at writing, but when people talk about Lisp having simple rules, I think, "yeah, I see that. I see the system that you're talking about and I see the consistency you're talking about." With Python I just don't see it; the initial impression makes it feel like a language written by graphic designers trying to create something that's pretty rather than systemic.


For what it's worth, I have come around to really enjoying the more recent versions of python, and it's what I'm writing most at the moment, but I totally agree with you here. I don't think of python as being explicit and having only one way to do things. I think that's a pretty inevitable result of trying to both add new things and also keep older things for compatibility.


I code both.

> explicit over implicit

Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

Further, go panics in surprising ways sometimes. A library developer might decide to panic and cause your program to crash, when that library was only used in your program for optional behavior.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly.

Maybe I'm reading it wrong, but it reads that you have been accustomed to implicit nil type conversions in other languages and that tripped you up when Go required you to be explicit. It seems Go is the explicit one here.

> A library developer might decide to panic and cause your program to crash

Go is abundantly clear that panics should never cross package boundaries except in exceptional circumstances. And those exceptions should crash your program. An exception means that the programmer made a mistake and the program is now invalid. There is nothing left to do but crash.

It is technically true that a library developer can violate that guidance, but it is strongly encouraged for them to not. The parent did indicate that sometimes the explicitness is only encouraged, not enforced.

Not that Go is some kind of explicitness panacea. It has its own fair share of implicitness. But I'm not sure these two are representative.


> Go was the first language I used that broke an explicit check for nil/null/None. I had to cast my nil to a different kind of nil in order for the program to run correctly. C/Python/Java/C++ didn't have this issue. Null was always null in each of these languages.

I think it makes sense - a `nonexistent<ThisType>` is different from a `nonexistent<ThatType>`.

In C, Java, C#, C++ and other languages, null/nil is a value. In Go null/nil has a type. Now that I've used Go a little, I actually miss having error messages for certain things, and comparing a `nonexistent<ThatType>` to a `nonexistent<ThisType>` is actually a logic error in my code that I want to be warned about.


I don't think "you can define + on custom data types" to be the same as implicit behavior.

"Explicit is better than implicit" is more around things like Django not just automatically importing every directory it sees (instead requiring you to list the apps used). It's also a decent rationale for changes made from Py2 to Py3 regarding bytes-to-string conversions needing to happen explicitly.

It's also about how usually you end up explicitly listing imports (wildcard imports exist but are pretty sparing), or lacking implicit type conversions between data types.
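
The bytes-to-string case in miniature:

    raw = b"caf\xc3\xa9"

    # Python 2 coerced between byte strings and unicode implicitly in many
    # places; Python 3 makes the boundary explicit:
    text = raw.decode("utf-8")   # bytes -> str, with the encoding named
    back = text.encode("utf-8")  # str -> bytes
    print(text, back == raw)     # café True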

This stuff is of course dependent on what library you are using, the projects you are working on, etc. And there's a certain level of mastery expected. But I think the general ideas exist despite the language allowing for much more in theory.


This held up much better in the earlier days of Python.

Sooner or later every sufficiently popular programming language is confronted with the dilemma of either breaking backwards compatibility and/or adding "alternative ways to do things".

Pathlib is an interesting one. It even has a correspondence table [1]. The introduction of pathlib made sense because it is just much nicer to use. But you couldn't drop the equivalent functionality in the "os" module for backwards compatibility. It's just far too widely used.
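
The duplication in miniature (the config file name is invented):

    import os.path
    from pathlib import Path

    # The same path spelled both ways -- and both spellings have to stay:
    cfg_old = os.path.join(os.path.expanduser("~"), ".config", "app.toml")
    cfg_new = Path.home() / ".config" / "app.toml"
    print(cfg_old == str(cfg_new))  # True (on POSIX, at least)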

There is no magic bullet for this one. Either you accept introducing duplication in your language or you go through a hard change (f.ex. the Python 2 to 3 change).

The softest way to migrate might be to flag functionality as deprecated and give users a fairly long time (talking years) before dropping it. But no matter how much time you give the users there will be users who won't change until it's too late.

So there really isn't a winning path for this one.

[1]: https://docs.python.org/3/library/pathlib.html#correspondenc...


That was the one mantra from the zen of python that I always laugh at too.

Because my #1 complaint with Python is that there are so many ways to do the same thing. It sounds nice in theory if you are a new developer, because you can create your own voice and character in Python.

For example, lets say I gave a simple coding puzzle (think leetcode) to 10 python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts but they would be significantly different.

By comparison if I gave the same puzzle to some Go engineers, all of the responses would be probably very close to identical.

This sounds fun as a new engineer, like this is an argument for using Python. But as you get older and more experienced you will likely find yourself hating the flexibility of python because as you scan any decent size codebases, you start to be able to identify entire parts of the codebase and who wrote them, without even needing to do a git blame.

I am currently an SRE manager who works in a very polyglot environment (primarily Node/js, bash/sh, python, and Golang with just enough java thrown in to ruin your day). When I read through python codebases, I can identify exactly who wrote it, even past employees. Then when you need to fix it, you often have to adapt to that style as you fix a bug or add an updated feature. This is a little true with Bash because you will see certain engineers always lean towards sed and others towards awk and grep to solve problems depending on their strength, but it is less significant. However, in our Go codebases, everyone writes nearly identical code.

I've been writing Python for over a decade and I still learn new features of the language every week or month. Just this week I had to dive into the `ast` library (abstract syntax trees) which was a native module I haven't ever touched before. By contrast, I haven't had to learn any new core syntax and tools of Go and Bash in a long time. You have a fairly small set of syntax and that's it. The power comes in how you use that small set of tools, not learning a new language module.

Again, the infinite flexibility sounds nice. But in a corporate environment the strictness of other languages is actually a benefit because it keeps everyone following the same path.

I truly believe the mantra from zen of python:

> There should be one-- and preferably only one --obvious way to do it

Sadly, Python lost this tradition far before I ever became acquainted with the language. And the Python 3.10+ mentality of supporting everything forever is only going to make it worse over time.


Either you move with the times or you become obsolete. 20 years ago Python codebases were clean and consistent, but language design has moved on, and Python has - barely - kept up with it, so now you have people who know the new ways and people who know the old ways and various stages in between (and it's not like they didn't try ditching backward compatibility, but that didn't work out well either). Go has the luxury of starting 20 years later and being a lot less ambitious, but it'll happen to Go too in time.


> For example, lets say I gave a simple coding puzzle (think leetcode) to 10 python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts but they would be significantly different.

I get what you're saying, but I don't think that's as true as you mean. I think that most experienced Python developers tend to gravitate towards the same style of solutions for things like coding puzzles (including the classic "read in data, map it through a comprehension, write out data" pattern).

There are multiple ways of expressing the same thing, but very often in a specific context one of the ways tends to be overwhelmingly a better fit than the other.


> For example, let's say I gave a simple coding puzzle (think leetcode) to 10 Python engineers. I would get at least 8 different responses. A few might gravitate around similar concepts, but they would still be significantly different.

This is true for any language. Arguably, what's different about Python is that the more senior the engineers you're interviewing, the more likely their solutions are to converge.

Just because something is in the Zen of Python, doesn't mean it automatically gets followed by every Python engineer. It's Zen, after all - you don't automatically get enlightened just by reading it.


There's generally one obvious way to do it. There's almost certainly other ways to do it, and those other ways might be better suited for different specific tasks, but the obvious way tends to exist and be widely usable.

Are there some exceptions? Sure.


I'd suggest "pythonic", rather than obvious, for that sentence. It's one of the ways the community reminds itself of the utility of consistent discipline.


It's perhaps worth noting that the next line of the Zen of Python is: "Although that way may not be obvious at first unless you're Dutch." (I.e. unless you're Guido.)

So it was a bit of a tongue in cheek statement from the very beginning. :D


IMO, it's not “hilariously wrong” in Python.

Note that as written the priority is that every use case has at least one obvious approach, and that a secondary priority is that there should be a unique obvious approach.

People often seem, however, to misread it as “there should be only one possible way to perform any given task.” Which, yes, is hilariously false for Python, but also not what it says.


Yeah. It is now.

It didn't used to be though. People would wax poetic about how pythonic some codebase was and sing the praises of idiomatic Python. There was an ideological war between Python ("there should be one--and preferably only one--obvious way to do it") and Perl ("there's more than one way to do it", or TIMTOWTDI, pronounced "Tim Toady").

Generators and comprehensions and decorators and `functools` are all relatively new.
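For a sense of scale, here's the same list of squares via the original loop versus the later additions (a toy sketch; version notes from memory):

    from functools import reduce, lru_cache

    squares = []
    for n in range(10):            # the original way
        squares.append(n * n)

    squares = [n * n for n in range(10)]          # list comprehension (2.0)
    squares = list(n * n for n in range(10))      # generator expression (2.4)
    squares = reduce(lambda acc, n: acc + [n * n], range(10), [])  # via functools

    @lru_cache(maxsize=None)       # decorator syntax (2.4)
    def square(n):
        return n * n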


I feel like the entire industry has adopted Python's approach here, so probably Python doesn't stand out on this point as much as it did in the early days.

Compare Python to C/C++, where even just getting a working binary from source is fraught with many competing tools. Or boost vs std and whatnot.

Others have already pointed out the contrast with Perl.


There is always one obvious way to do things in Python, but it happens that what's obvious varies from person to person.


Like those various personal subsets of C++ :)


As evidenced by the unified Python library / dependency management system.


I use pip 100% of the time


>> I use pip 100% of the time

What about pipenv, poetry, conda, setuptools, hatch, micropipenv, PDM, pip-tools, ActiveState platform, homebrew, or your Linux / BSD distro's package manager?


As much as I love to rag on things, I would go so far as to say that the big problem with Python packaging is the fact that it tries to manage C/C++ packaging and integration.

If Python is only managing Python code, the issues to be solved are VASTLY simpler. Just about anything works.

Once it has to deal with dynamic libraries, compiled code, plugins, build systems, etc. the combinatorial explosion just makes life suck.


Python is a victim of its own success and versatility.

It has become a better "glue code to solve your problem" than many other solutions and so people want to use it everywhere for everything.

It gets packaged in many different ways because it gets used in many different ways for many different purposes.


Packaging is a difficult problem and all attempts to simplify things fail to understand how complex the problem is and so fail in some way. (Attempts like .deb do okay by only focusing on a subset of the problem)


I use none of those except for homebrew, but I didn't mention it because it's for installing complete programs that happen to be written in Python, not Python dependencies for when I'm working with Python.


You forgot about egg!


pip (with venvs, which are a built-in Python feature) covers 99% of all use cases.

Yes, its dependency resolution could have been better, and lock files are nice (which can be emulated using constraints [1]) but I don't understand why people are so busy writing alternatives. I work on pretty sophisticated codebases on a daily basis and haven't used anything but pip.

[1] https://pip.pypa.io/en/stable/user_guide/#constraints-files


Good for you but many don’t. Most people I know use Anaconda and install things with Conda


Use Poetry to produce a lockfile (or whatever lockfile-producing tool you like), and then let Nix take the wheel. It solves the dependency problem not just for Python, but for your entire stack. I've been able to hand it off to my coworkers and have everything just work with a simple "nix-shell" command, even with a dependency tree chock full of gnarly ML libs.


Whatever in the NamedTuples do you mean?


There is also TypedDict, Pydantic, Attrs and dataclasses
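The same two-field record in the four stdlib flavours alone (a quick sketch; Pydantic and attrs would add two more):

    from collections import namedtuple
    from dataclasses import dataclass
    from typing import NamedTuple, TypedDict

    PointA = namedtuple("PointA", ["x", "y"])   # classic namedtuple

    class PointB(NamedTuple):                   # typed namedtuple
        x: int
        y: int

    @dataclass
    class PointC:                               # dataclass
        x: int
        y: int

    class PointD(TypedDict):                    # dict with typed keys
        x: int
        y: int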


Yep, but couldn't be bothered typing them all :D


Far faaaar better with respect to this than R though at least.


You're young enough to have never dealt with write-only Perl code ...

Be glad.


> You're young enough to have never dealt with write-only Perl code ...

How young does one have to be for that to be true?

As recently as 2017 I was employed at a place where there was a significant amount of perl code, as part of the build system (ugh) for the C code, and generating html (for some other system).


Back then (in 2004) the most popular programming languages were PHP, Perl, C++ and Java. Java was pretty focused (and even then, there was the distinction between int and Integer, between int[] and ArrayList, etc.) but C++ and (especially) Perl were driving people crazy because there were thousands of ways to do the same thing. And nobody had any respect for PHP's design, so let's not even talk about that one.


Not really. For the core language this applies, but it obviously does not extend to 3rd-party libraries, since anyone is free to reimplement what someone else has already done, hopefully better, if they want.


I also like the idea of explicit > implicit, but then you see Django.

I hate significant whitespace though. I can work with it, I get it, I still don't like it.


That was then. Python does much, much less explicitly these days, too.


> I think Python was popular as a general-purpose language first.

That matches my memory as well. I can't find any references, but I seem to remember a quote going around way back circa 2008 that goes something like "Ruby is popular because it's the language used to write Rails; Django is popular because it's written in Python."


I think this is it. Love Ruby as a language. Have grown to hate the way Ruby developers write stuff. Stuff ends up with so much abstraction and indirection. End up yearning for a simpler time when you could write code that just did what you needed it to.


I've seen that in Rails projects too. They're often way more complicated than they need to be.

My rule of thumb for a medium or large project is that one probably shouldn't mess with the dynamic and meta bits of Ruby unless writing a framework, and maybe not even then.


100%. You can absolutely still just use off-the-shelf Ruby-the-language to get things done, without involving Rails at all, but at this point I think it would be just as weird as using awk or perl for writing a command-line program.

I love tons of things about Ruby, like first-class/syntactic symbols and especially the pervasive and intuitive (in my opinion) use of blocks in the standard library.


Delving into a gem sometimes makes me lose faith in ever using dependencies again.


There are just too many ways to do things in Ruby. How many forms of if-else or looping can you name in Ruby? Just as the simplest example.

Monkeypatching is also awful for readability.

Explicit imports are way more readable than things implicitly appearing in the current namespace, as happens with Ruby.


> There are just too many ways to do things in Ruby

I think this is not completely true. Ruby is conceptually very simple. It's not C++ or Perl. Everything is an object, and everything is done through method calls. There is also extensive support for functional programming, so iteration is performed via filter/map/reduce and similar higher-order functions. Besides, the block/proc/lambda design makes it easy to turn APIs into DSLs.

For those reasons, Ruby gives the closest experience to programming with Scheme I have ever experienced outside the Lisp world.


I've been writing Ruby daily for years and never written a `for` loop


Started Ruby in 2007 and have used it professionally essentially nonstop. I have also never written or seen a for loop. I don’t even think I could correctly guess the syntax of it.


Monkeypatching exists in both Ruby and Python.


It does but it's generally considered a bad practice in Python (and many other languages) and a feature in Ruby.


Two: `for` and `each`. Most people use `each`.


Python fixtures anyone?


I remember getting into Python because it had slices and other syntactic sugar that made you think other programming languages were making your life miserable on purpose!

Also, the batteries included in the standard libs was incredible when you needed to use the ACE/TAO monster in C++.

Finally, interfacing with native code via SWIG enables you to quickly use critical optimized code.

Obviously, other programming languages have since caught up with Python's powers.


This sounds like an idea without any real data. For people to prefer one they had to try both and I guess especially data science people just took the most common language in their field.

People seem to forget or miss that it took many, many years for Python to become popular outside data science. If it had just been clearly better or easier that transition should have gone much faster.

As someone else mentioned, at one point universities started using Python as an introduction language. I am still sad that they did not choose Ruby or js but here we are.


I don't buy this at all. I think it's path dependent. The language differences aren't big enough to dominate over ecosystems and libraries. Ease of integration with C can make a bigger difference than significant whitespace. Etc.

Personally I think Python is an exceptionally ugly language for one as popular as it is (the magical underscore identifiers really bug me, and I think list comprehensions are deeply inferior to monadic chaining - there's a reason nobody copies them but everyone got LINQ envy). But it's clear from a perusal of code in the areas where Python dominates, data science and machine learning, that aesthetics are very far from people's minds. They'd be using Javascript if it had the libraries available.


>I think Python was popular as a general-purpose language first.

What were their choices though, Perl? It's easy to see why Perl lost out. Other than PHP, I don't really know of any other JIT scripting languages they could have chosen.


I learned about Python around 2006 when I got into Linux, and at that time (when Python 2 was a thing, iirc it had just come out) it was very popular within the FOSS world for building apps and GUIs (I even toyed a lot with PyGTK), whereas I felt Perl was much more for "serious" stuff like text processing, kind of an sh with steroids. I had barely heard of Ruby and wasn't sure what its purpose was - I had just heard about its "gems" but not about Rails. Still, as both Python and Perl were FOSS, I assumed their niche and user base would stay around it, as with many things FOSS at the time.

In 2008 I started my graphic design studies and I pretty much had to forget about programming (which in hindsight, if I had kept doing it I would have a very strong programming background and maybe my life would be much better now, but it is what it is) - but I was very surprised to discover around 2011 or so that it seemed _everyone_ was using Python. Like I just blinked and it took over the world somehow.


The other strong contender at the time was Tcl, especially when combined with its graphical toolkit as Tcl/Tk. Python implemented a Tk binding as well (tkinter), given its popularity.

Tcl's extensibility led to expect, which was very useful for automating scripting over telnet.

https://en.wikipedia.org/wiki/Expect


Tcl also influenced many Microsoft Powershell aspects.


There were always other choices. Lua is the main one that comes to mind.

The point is that data science use came quite late in the Python world, and the increase in users due to it is incremental. Python was already at 3.x before the ML world adopted it. If the ML world picked another language, Python would still be in the top 5 languages.


You arbitrarily restricted the scope. Java is what I saw.

I was in college 2006 - 2010 in CS, and while all the introductory courses were in Java, by 2008 or so a lot of the other students had switched to Python on their own, for projects where either language would work. Didn't really see anything else, just Java and Python.


No, it wasn't arbitrary. A JIT language is much easier to pick up than something needing a compiler and an executable. The focus on a scripting language was deliberate

Edit: turns out the term I'm looking for wasn't JIT but an interpreted language.


Perl and python have opposite philosophies with regards to standards. Python prefers a standard "pythonic" way, while perl had its "there is more than one way to do it".

It would seem that having a standard is more popular.


Yeah, there's the ongoing Perl joke about writing a script that works today, but not understanding how it works tomorrow. Too much one-liner type stuff that did not allow for maintainability


That’s any code I haven’t looked at for a while, to be honest. I can’t count how many times I’ve looked at code I wrote or bug tickets I fixed and have absolutely no memory of doing it. It’s almost like the act of committing flushes the local storage in my brain.


I run into this problem as well. I'll often come across something I wrote a few years ago and struggle to remember why I wrote it that way.

I've learned to add comments to my code. From what I see, commenting code is frowned upon by a certain subset of developers, but I've taught myself to add a short comment explaining what the code is doing whenever I'm doing something subtle or unintuitive. For every potentially unnecessary comment I've added, I've also saved myself time when I've had to come back to something months or years later and could refer back to the comment.


> from what I see commenting code is frowned upon by a certain subset of developers

I am not able rightly to apprehend the kind of confusion of ideas that could provoke such a... belief.


The main argument seems to be that comments should be left in the commit message and not "inline", i.e. mixed with the code. Personally I make heavy use of inline-style comments.


I've worked in tech my whole life and never heard a dev make these comments are bad claims. I only work in the boring world though. Is this a startup or FAANG or silicon valley thing?


In Clean Code there is a chapter about eliminating useless comments. I think this has some merit. Consider the following made-up example:

sum = sum_two_numbers(number1, number2)

Probably doesn't need a comment.

The other argument is that comments need to be maintained and are subject to decay. E.g. the code they are commenting on changes.

The book goes into other examples, but I think the idea isn't to eliminate comments but to be thoughtful and judicious about which ones you use.


Perl is different, as people really leaned into this on purpose, maximizing the concept of the one-liner. Maybe it's a PTSD kind of effect? "Any" code, or at least good code, doesn't try to be difficult to parse for the next person. Perl developers definitely didn't adhere to the policy of writing code as if the person maintaining it is a serial killer who knows where you live and targets people who write unmaintainable code.


> Perl and python have opposite philosophies with regards to standards. Python prefers a standard "pythonic" way, while perl had its "there is more than one way to do it".

It wasn't just "having a standard" that mattered; in Perl it was actively encouraged to find a different way of doing something.

"There is more than one way to do it" was often taken by the Perl programmer as a challenge to find all the other ways, and then use them all in different parts of the same program.


Icon? Smalltalk? Dylan? Scheme? Common Lisp?


Perl's actually excellent at processing unstructured data, and it had a strong foothold in bioinformatics for a time. I don't think the decision was as obvious as it looks.


This is true. Bioinformatics was full of Perl scripts for a variety of (text) analyses. I however remember well that many students began to hate it soon after working with it, as it was very difficult to understand existing code. So, when given a choice, many chose Python as an alternative. And stayed with it.


For most people the thing that is hard to understand about Perl scripts is the regexp code. However, regexps look more or less the same in any language. But, the thing is, most Perl scripts process things such as log files and similar data, which makes the scripts highly dependent on regexps, hence hard to read and maintain. The same goes for any code that uses a lot of regexps.

Actual Perl code, disregarding regexps, certainly isn't any more difficult to comprehend than code in most other languages.


Python may be JIT now (is it?), but it certainly wasn’t back then.


JIT may not be the correct thing to call it. At least in my head, any script that doesn't need to be compiled by the dev or the user before running is JIT. It's the scripting vs building difference to me. If that's not correct, then I'd love to be corrected.

Here's my reference:

https://en.wikipedia.org/wiki/Just-in-time_compilation


Interpreter vs. Compiler might be closer to the distinction you are looking for:

https://en.wikipedia.org/wiki/Interpreter_(computing)


Thanks. "My interpreter doesn't care about white spaces, why should I" being something that should have clued me in.


JIT is a compilation model for interpreted languages, which translates bytecode into machine code in runtime to obtain better performance. The opposite of JIT is AOT (Ahead of Time). JIT languages go through two compilation steps: one to bytecode, and another to native code in runtime. Java is one example of a JIT compiled interpreted language, that also does ahead of time compilation to bytecode, while python compiles to bytecode transparently, but does not do any JIT compilation to native, which is part of the reason it's considered a "slow language" (though this expression doesn't really make any sense, as slowness is a property of the implementation, not of the language).

TLDR:

Java uses AOT bytecode compilation + JIT compilation

Python uses import-time bytecode compilation + slow interpreter
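You can watch the first half of that pipeline yourself with the stdlib `dis` module (a quick sketch; the exact opcodes vary by CPython version):

    import dis

    def add(a, b):
        return a + b

    dis.dis(add)   # prints the bytecode CPython compiled, e.g. LOAD_FAST / BINARY_OP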


> (though this expression doesn't really make any sense, as slowness is a property of the implementation, not of the language)

Yes, which is why this is the way CPython works, but PyPy uses JIT and is faster.


> and is faster.

After a very long warmup time, which may make one-shot applications much slower.


> There should be one-- and preferably only one --obvious way to do it.

There’s not even one obvious flavour of Python to use.


It's hard for me to name a problem domain that doesn't have 3 different python packages doing the same thing three different ways, each more clever and less understandable than the last.


for real

If it's not on the stdlib, I'd go with the most popular one, or just DIY

No, it is not "best practice" to use a barely tested module with half the bugs affecting its main functionality just because someone thinks it is

Unfortunately package abandonment is a thing, and maintainers often don't bother fixing needed bugs (which, coupled with "the urge to deprecate", makes things annoying)


Why TF would you name an HTML parser BeautifulSoup?


I Googled it, apparently it comes from this https://en.m.wikipedia.org/wiki/Tag_soup


OK, I get it, but when I'm trying to write automation for a shop that isn't natively Python-savvy (long story short, it made sense when I started a side project which evolved), if I use that library and ever move on, I now have to document in comments or somewhere WTF "BeautifulSoup" means. Because of some rando's inside joke they thought was funny.


This is actually one reason I prefer the Ruby ecosystem — whimsy is still welcome. The world has enough Noun Manager projects.


When I'm trying to figure out how to build a side project for my team and simultaneously fighting company bureaucracy who reflexively stonewalls things on the order of "no one's done that before, we need permission," I don't also want to deal with some immature idiot's stupid namespace. I want a damned Noun Manager for my own sanity's sake.


The name is also a reference to Alice in Wonderland.


Only one of the items in your "such as" list is unique to python's philosophy, and that is "there should be one way to do it". Ruby embraces the many ways to accomplish something, it's true. However, the three others are not unique to python. Ruby especially embraces "readability counts". And I can't think of anything in the ruby language itself that is implicit over explicit. Perhaps you are thinking of rails and comparing that to python.


For me, the Ruby community's comfort with monkey patching was a big turn off. In Python, you can hack around on a class and replace its methods, but if you do, it's expected that your coworkers might stick you in a dunk tank and sell tickets. It's just not the thing that's done.
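To be clear, the dunk tank is the only thing stopping you; the mechanics are a one-liner (toy sketch):

    class Greeter:
        def greet(self):
            return "hello"

    # monkeypatching: rebind the method on the class at runtime;
    # every instance, existing or future, sees the change
    Greeter.greet = lambda self: "hijacked"
    print(Greeter().greet())   # hijacked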


Monkeypatching is maybe more of a Rails thing than Ruby. I think the biggest problems with Ruby around the time Python began to take off were slowness (MRI was a disaster, slow, and full of memory leaks), plus patchy documentation (lots of things were in Japanese).

Still, I preferred and prefer Ruby. Python has fantastic libraries, but it is a mediocre language. Ruby feels like a simpler version of Perl + Smalltalk, and it is a joy to use. Python has intentionally crippled anonymous functions plus syntactic whitespace, which often leads to long and ugly code.

I think it is a shame Guido hated functional programming and he did not embrace an Algol-like syntax with begin/do end blocks. Those two things could have vastly improved Python. Ruby's block, procedure and lambda design is a stroke of genius that yields beautiful code and makes DSLs trivial.


>Ruby's block, procedure and lambda design is a stroke of genius

Hardly. Not only was it nothing great, it had many variations, making it harder to remember.


    For me, the Ruby community's comfort with monkey patching was a big turn off
If it helps, the Ruby community really soured on monkeypatching in general quite a while back. You don't see it much these days and every decent book or guide warns against it.

It was definitely a crazier place 10+ years ago in the early Rails days.


>It was definitely a crazier place 10+ years ago in the early Rails days.

Yeah! It sure was.

I was around, was an early Ruby and Rails adopter, and worked on a few commercial Rails projects at that time.

That's how I know about the monkey patching issues. Faced them in real life.


I certainly wouldn't blame a person for experiencing those crazier days and thinking the community didn't have the greatest technical vision.

I played around with Rails and Ruby when Rails first blew up. But I didn't start doing it full-time professionally until 2014. By that time it seemed to me that the community was maturing. I think it's in a good place now.


I figure if Django built an entire framework around metaprogramming, I'm going to do it to Django model objects as a form of vengeance.


I want to see the malicious wonders you can create!


An easy way to start taking the fight back to the enemy is by overriding __new__ and tinkering with things before/after Django's metaclass __new__ does its own metafoolery.

Also overriding mro() is a fun way to monkey with the inner workings.
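A sketch of the general trick in plain Python (the names are made up; with Django you'd subclass its model metaclass instead):

    class Meddling(type):
        def __new__(mcls, name, bases, namespace):
            # tinker before the real class is built...
            namespace.setdefault("created_by", "Meddling")
            cls = super().__new__(mcls, name, bases, namespace)
            # ...and after, once cls exists and can be poked at
            return cls

    class Victim(metaclass=Meddling):
        pass

    print(Victim.created_by)   # Meddling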


Whereas in Ruby you get to go to a conf for doing that.



> And I can't think of anything in the ruby language itself that is implicit over explicit

This is surprising to me. I love ruby, but it embraces implicitness more than any other language I've used. method_missing, monkey patching internals, and other types of metaprogramming make things very implicit at the interface level


If you want to really experience the true horror of implicit try Scala.


Scala's implicits are a lot less unpleasant because the type system means that you (and, perhaps more importantly, your IDE) know exactly where they're happening.


> Ruby embraces the many ways to accomplish something, it's true.

Far less true than with Perl.

On the other hand, many questions I've asked about Python came with various ways to do the same thing, so I'm not sure about that advertised claim.


Ruby has implicit return values for methods, but unless I'm wrong, so does Python.


Python has a default None return, Ruby returns the value of the last expression.

Neither (except maybe the None case in Python) is really implicit, Ruby is just an expression oriented language while Python is statement-oriented.

OTOH, in Ruby the keyword “return” is superfluous except for altering control flow for an early return, while in Python it is the mechanism for supplying a return value.
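Side by side, in Python terms (toy sketch):

    def implicit():
        1 + 1              # evaluated, then discarded

    def explicit():
        return 1 + 1       # 'return' is the only way to supply a value

    print(implicit())      # None -- the implicit default
    print(explicit())      # 2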


> Python has a default None return, Ruby returns the value of the last expression.

I would really hate this feature in a language without strong static typing.


Doesn’t seem like that big of a deal. You’re going to have a really hard time getting your first test to pass if you make a mistake.

I don’t mean using testing as a poor man’s type system, I mean even the tests you would write in a statically typed language will quickly uncover any mistakes you could possibly make there.


In statically typed systems, this wouldn't even compile.

I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.


> In statically typed systems, this wouldn't even compile.

Still, it's not like you're going to miss that a function doesn't return something expected. Even in statically typed languages you are going to have tests to validate that much.

> I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.

That's another problem entirely. Although even then it would be pretty hard for an errant type to not blow up your tests. Again, not tests specifically meant to check that your types are sane, just the same tests you would write in a statically typed language.


> I'm pretty sure I'd see multiple cases where the function would return different types depending on a code path.

Quite commonly by design, or, “why union (and sum) types are a thing”.

Something returning type “Foo | nil” is very often going to return Foo on one branch and nil on another.


Python's only implicit return value is the default `None`.


Thanks! Adding that to the things I've learned (or unforgotten, I guess is more accurate)


Well, also, if you're going to write something like Numpy, Python is the most hospitable language for it. C extensions really are a superpower of the language. It would not be easy to achieve a similarly effective result while working through a more standard-issue foreign function interface.
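And even the low-effort route is short; here's a sketch using the stdlib ctypes FFI against the C math library (library lookup is platform-dependent):

    import ctypes, ctypes.util

    libm = ctypes.CDLL(ctypes.util.find_library("m"))  # the C math library
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double
    print(libm.cos(0.0))   # 1.0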


> There should be one-- and preferably only one --obvious way to do it.

It looks too late for 3.13[0]

Maybe they can channel the BDFL in the Packaging[1] thread for version Pi.

[0] https://docs.python.org/3.13/whatsnew/3.13.html

[1] https://discuss.python.org/c/packaging/14


> I think Python was popular as a general-purpose language first.

Is that true? My understanding was that it was a scripting language first (still is), but then got taken up by data and science people and various other niches. Then some education books and courses, like the Artificial Intelligence: A Modern Approach and later MIT and other university adopters. And all of that began to snowball where Python was either the only or main language people knew, so they started using it for everything indiscriminately.


> I think Python was popular as a general-purpose language first. After all, there was a reason people put so much effort into writing Numpy in the first place.

What general purpose? That's just a buzzword. Especially back then, shipping Python apps was never a viable option compared to binaries compiled from C/C++ or Java. There never was such a "general purpose".


I've used both ruby and python extensively, and I don't actually consider them to be any different, really, on any of these points.


> I think Python was popular as a general-purpose language first

What I understand is that Python was popular in the anglosphere as a general purpose language first, Ruby was somewhat popular in Japan earlier as a general purpose language but didn't become popular in the anglosphere until Rails.


I was actually surprised when I found out they made a web framework in "the RPG Maker XP language".


> Readability counts.

In what way is numpy readable?


It's designed to be familiar to people who know Matlab. Matplotlib bears the same burden. Sometimes you have to start with something that's familiar to users and slowly change when you have people on board.


LOL yeah, while the principles listed in the Zen are good advice and practice, unfortunately most of them don't really depend on the language itself but rather on whoever writes the code and their implementation choices. Of those few that do depend on the language, too bad Python violates pretty much all of them. Starting with the very 1st and 2nd lines, where significant whitespace (one of the worst ideas ever in programming, if you ask me) makes block limits implicit in the indentation rather than explicit in the code with clear delimitation marks (brackets, for instance), making most source code written in it look quite ugly, and, as a dyslexic, I can add, not very accessible. Let alone the horrible experience while iterating on a piece of code: 99% of the time you're briefly stopped by an error simply because in the iteration some lines got commented out and now the damn thing throws a fit about the indentation. Not really practical in my experience.

The reason why one language is more used than others at any given time is way simpler and more bound to humans than the languages themselves: fashion trends, laziness, sloth.

Most of the people out there writing code and "increasing numbers for any given language" have no real idea why they started with one language rather than some other one; they never really dug deep enough to make an informed choice, and most will keep using a single programming language because they "don't feel the need to learn a new one", aka: I'm too lazy to ever go deep enough into the only language I know, let alone learn a new one. And it's the market's fault: we spent the last decade or more touting how many bazillions of programmers will be needed, how anyone can get a great life by simply learning a bit of coding, etc. Nobody gave a fuck about quality, the only goal being to cheapen and cheapen the software developer profession, until neural networks came about and indirectly revealed the truth: we haven't been raising SW developers/engineers/etc, most of them were just code typists copying out of Stack Overflow. If something like Copilot or ChatGPT can substitute for them, it means there wasn't much value there in the 1st place. In 2007, Jeff Atwood made the quote that was popularly referred to as Atwood's Law: "Any application that can be written in JavaScript, will eventually be written in JavaScript.", and that's NOT a good thing; it's just the epitome of the state of the industry.

In Python's case its luck was Google: Python (like Go, for instance) is a convenient language for system automation, let's say a more sane version of what Perl was mostly used for in the past (if you notice, lots of Python Zen ideas are attempts to fix Perl's insanity). Google has lots of systems engineering going on, lots of people using (and abusing) Python, and a single repo where everything ends up, and when they started making neural networks with it, Python became fashionable for making neural networks. Anyone and their dog wanting to try out some kind of machine learning (10+ years ago) would find a tutorial in Python, and TensorFlow sealed the deal.

Yes, numpy and pandas did have quite a bit of weight in luring the math community into using Python, but there's nothing inherent in Python that makes them possible; they could have been made in any other language. For instance, Haskell and Lisp are way more approachable from a math standpoint; they're just not in fashion any more.


No, that’s self-important bullshit. These are just evidence that Python really suffers from people in that community not using any other languages.


Numpy is certainly amazing, but there are tons of competitors in the data/scientific space, which pure "A-type" data scientists tend to prefer: R, SPSS, Matlab...

The difference is that Python doesn't entirely suck as a general-purpose language. Sure, you might have better options, but it's still reasonable to write almost anything in Python.

Other scripting languages like Ruby, JS and Lua are probably a little bit better when evaluated strictly on their merits as programming languages, but they lack the ecosystem.

In other words, it might be inelegant, slow, a mess, whatever, but the concept behind it is basically correct.


>> R, SSPS, Matlab...

I'd immediately throw out "competitors" that cost hundreds of dollars to use. https://www.mathworks.com/pricing-licensing.html Academic pricing is $275/yr and normal is almost a thousand a year. Then you pay for modules on top.


Yes, and this is why R has become so dominant in the data and statistics spaces. R is free, and has gained so much momentum that it often has packages that no other language has. Yesterday I found out that there was just no way to easily reproduce mgcv::gam in python. There are loads of similar examples.

As someone who likes both R and Python, and had the misfortune to learn Stata; I am very glad to see R winning over these expensive dinosaurs. Maybe one day it will even displace matlab!


R even has packages to read in NanoSIMS data! [1]

There are perhaps a hundred NanoSIMS machines in the world as they're extremely expensive [2]. Yet we have an R package on CRAN to analyse their data!

[1] https://cran.r-project.org/web/packages/lans2r/vignettes/lan... [2] This 2017 review says there are 42 machines: https://www.osti.gov/servlets/purl/1398172


numpy exists (somewhat indirectly) because matlab cost $$$. and now numpy is a far larger use of matlab's concepts than matlab will ever be.

I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language and couldn't be ported to Python. I don't want a "data/scientific" language, or a "web" language, or a "UI" language- I want one language that explicitly supports all the use cases well enough.


> I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language

Not intrinsic to the language but in terms of the ecosystem, for a purely A-type data science workload R is significantly better. dplyr + ggplot vs. pandas + matplotlib isn't even remotely close.

Now, obviously in the real world nothing is ever purely model work, which is why Python more than makes up for the difference.

> I don't want a "data/scientific" language, or a "web" language, or a "UI" language- I want one language that explicitly supports all the usecases well-enough.

Me neither. *taps sign*

>> The difference is that Python doesn't entirely suck as a general-purpose language.


R is great when your data is already together in one place and homogeneous/processed. In that instance the dplyr/ggplot workflow is better. R the language is hot garbage though, and the libraries available for doing random things, other than working with certain data structures, doing math, or plotting, are generally much worse than their Python equivalents. That is the reason Python dominates data engineering, and why it's pushing R out everywhere but academia.


R will still have a solid place in industrial pharma for a long time (even as Python grows, R growth will continue). R is pretty tightly intertwined with the process of processing and submitting data from clinical trials to the FDA, much to SAS's chagrin.

Personally I think we have to accept that R, Python, Java, C++, and Go are going to be with us for the rest of our lives. I would expect PHP, Ruby, and Perl to go away much faster. Rust is still in question.


Are you from the future? R is making progress to replacing SAS in pharma, but has a long ways to go. Here is an article from last year[0] patting themselves on the back for making a R submission to the FDA. There are oodles of existing processes where data has to be formatted just so because that is how SAS does it. Nobody wants to rip up and re-validate that code until they must.

[0] https://www.r-consortium.org/blog/2022/03/16/update-successf...


The trope about SAS being required is pretty old... people trotted that out when I last worked in pharma ~15 years ago. The FDA specifically released an article saying that R is perfectly acceptable for submission https://blog.revolutionanalytics.com/2012/06/fda-r-ok.html

Please don't spread the SAS FUD. I work in Pharma and talk to the statisticians all the time.


I currently work in pharma, and I can point to teams of SAS programmers. There are definitely R efforts, and I have no doubt it is the eventual future, but it is not yet here.


Rails is probably going to keep Ruby relevant for a lot longer than Perl. I might go as far as to predict that Ruby will be the new COBOL or FORTRAN - widely used but neither cheap nor easy to find someone to modify it.


If there is anything in R that is better, it could be ported. It's kind of different from language intrinsics that can't be easily ported. In fact I think there's a great argument for writing a standardized data processing and visual representation layer in C++, and then making it usable from all the languages. It would be nice if this was true for the notebook layer.


It's the domain knowledge that's difficult. Like the example up thread about the lack of generalised additive models. The version in R was written originally by the developers of the method. There's no Python version because no one with sufficient expertise has ported it.

Don't get me wrong, I now mostly write Python because it's a better general purpose language, but there are big, big tradeoffs.


Matlab has a more interesting dispatch model than python though. (Mathematica has even better dispatch but I haven't used it in almost 20 years at this point) Julia's model is even better, but I just never have time to dig into it deeply enough to get to the zen of Julia. One day when I have time to understand Julia I'd like to, but it's hard to compete with the reliability of python just always working.


What is dispatch in this context? Function call polymorphism?

Aside: I wasn't a big fan of the Mathematica language so I wrote "PyML" a looooong time ago- it turned unevaluated Python expressions into Mathematica, transmitted them via MathLink to be evaluated by the Mathematica kernel, and then the results were sent back and converted to a Python expression. This was long before sympy. It never went anywhere (I found ways to simply not depend on Mathematica).


Yes, function call polymorphism. Especially combined with Mathematica's piecewise syntax. I didn't really use it for numerics, it was more for Lisp/Prolog type things. I was always impressed at how short and readable the Mathematica code ended up being. I used to use Mathematica quite a bit to generate C++ functions.

But it's a very different use case than Matlab. One of the things I like about python is it isn't too bad for sliding between lispy/functional modes for some things and more lapacky things elsewhere. Matlab has a better syntax for the lapack parts but trying to do anything functional gets really annoying and very, very slow. But still I would say Mathematica has the best pure math syntax for those sorts of things. I don't have any experience with sympy, I should probably check it out.


Mathematica is nothing less than a work of art. I dedicated a fair amount of time to become proficient, but it never solved the use case I had (find derivatives for complicated functions- this was before autodiff was common).

Unfortunately, Wolfram himself has made a number of decisions that greatly limited its applicability and so it will always be niche.


> and couldn't be ported to Python

“Couldn’t be” is not the same as “has been”. If R has a package that Python doesn’t, I’m not going to port it to Python, I’m just going to use R.


>I haven't seen anything an "A-type" data scientist needs in R/SAS/SPSS that was intrinsic to the language and couldn't be ported to Python.

I don't know what "A-Type" is so I might be misunderstanding you.

For SAS, at least in my org, what keeps it entrenched is Enterprise Guide and the associated environment. Having a drag-and-drop GUI for data extraction, manipulation, and the analytical tasks makes it very easy for non-experts to be incredibly productive. The people that use it here are engineers (the non-software variety) and finance/accountant types. These people would not be productive writing Python (or R) code but still need to use something heavier than Excel for data analysis.

Over the years we have chipped away at parts of SAS with things like Power BI and Azure ML Studio but I don't see python playing in the same space.


Well, why didn't the free alternatives to Matlab like Octave or Scilab win out then ?


Same reason people use Excel and not LibreOffice. Until you can promise 100% compatibility, users are going to run into something that breaks vs the original platform. For real business processes, the licensing cost is not worth sweating about the problem for parity.


I think people were not ready to switch away from heavy established software. Also, at that time, costless was seen as lower quality (perhaps because organisations are happy with licences and always looking for a support contract?). Coming later, when the market(?) was ready, made R more successful.

I think people compared them to the established standard, and detractors were always there to point out missing stuff, giving the impression that Octave or Scilab were behind. Coming with a radically different language, R was condemned to quickly succeed or perish, and in any case avoided direct, frontal comparisons.


But R is itself a reimplementation of S.


Anyone who used Matlab to process text before 2015 wouldn't consider it anywhere close to a competitor to Python.

Same goes for R/SAS.

Python has no competitor in terms of the whole package, it is a full featured language while being particularly good at data


This is a good point.

I've done a fair bit of R and there is no doubt that as a pure data science tool it's usually quicker to produce something than Python.

But I prefer Python because inevitably any data science project ends up having other "bits" that aren't pure data science, and Python's super strong general purpose computing libraries are hard to beat.


Some industries are stuck on matlab (jet propulsion, for example) but in general people from sciences are either Python or R, and SAS/SPSS are only really used in places like health care or pharma where there's a lot of regulation in place, because the software is designed around that regulation, unlike Python and R.


It is also still quite dominant in neuroscientific fields, as there is a lot of software and code there from the past, but it is changing little by little.


>Python doesn't entirely suck as a general-purpose language.

I always get caught off guard by comments like these. In my mind Python doesn't at all suck as a general purpose language. Only real argument is execution speed, but most people aren't actually writing code where Python's "slowness" matters and if you actually need that speed you can write that part in C and still use it from your Python application.


Matlab costs $$. Matlab was ahead in perf for a long time, not sure now. E.g. they used to JIT code, etc.

Comparing python to R, python feels like a much more mature language. R always seemed only useful for small scale projects, e.g. writing some script to process data with < 20 lines of code. Python codebases just scale much better.


It also has the nice feature of being good at multiple things. If you get a model running in Python, and decide you want to be able to run the code via an API, put it behind Flask and ta-da!, now it's a network service.

It's not the best way to write analysis code, and it's not the best way to write a web service, but it's probably the best way to write analysis code that runs on a web service.

I picked web services as a random example, not one I'm particularly passionate about. But as a general trend, if you want to combine 2 unrelated domains, Python probably has you covered better or more easily than just about any other language.
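For the web-service case, the whole "ta-da" is roughly this (a sketch; `model.predict` stands in for whatever analysis code you already have):

    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/predict", methods=["POST"])
    def predict():
        features = request.get_json()           # e.g. {"x": [1.0, 2.0]}
        result = model.predict(features["x"])   # hypothetical: your existing analysis code
        return jsonify({"prediction": result})

    if __name__ == "__main__":
        app.run()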


> If you get a model running in Python, and decide you want to be able to run the code via an API, put it behind Flask and ta-da!, now it's a network service.

You could say the same about a model (or any function) running in R. Want to instantly convert it to a web service? Use the plumber or RestRserve packages for that. Want to make a quick web application around it? Use the shiny package for that.

Just because Python is good at X, doesn't mean other languages are necessarily worse at it.


I would just add that for scientific scripting, Julia is usually nicer to program in than the combo of numpy and Python. Julia is fast enough on its own that you can do the matrix calculations directly, and it supports real mathematical matrix operators. It also has a great type-based design that makes using packages magical.


I think Julia has a hard uphill battle against the "Python is the second best language for anything" effect. Julia looks pretty cool to me, but I already know Python. I'll probably learn more about it in my spare time, but I'm also never going to be able to convince my coworkers to try it out, because they also know Python and that's good enough for them. And thus Python's dominance continues.


I was able to convince my coworkers to stop working on a Python project (a simulator for a scientific instrument) once we all realised that we would not be able to keep using plain Python because of performance. When the alternatives were (1) to force everybody to learn either C or Fortran, or (2) to rewrite everything in Julia, they asked me to provide examples of both solutions. After I showed them a few different approaches (pybind, cbind, f2py...), there was full consensus for Julia. We moved over what we had implemented so far (it was not too much, luckily), and so far nobody has regretted the choice.

The problem with using two languages is that you do not just have to learn a new language besides Python; you also have to figure out how to properly interface them. (f2py was considered the easiest solution, but we feared that the 1- vs 0-based array indexing would likely have caused many bugs.)


Did you evaluate Cython? I'm not anti-Julia, but I like that my Cython code is usable out of the box from Python, with no wrapping, and then users can continue to use their Jupyter + Python scripting workflows with performant bespoke modules complemented by the full Python ecosystem.

Someday I'll do a project in Julia. But for some such projects, Rust seems fully guaranteed to be performant while Julia might or might not be, so I might still lean towards Rust (unless one of the high-quality Julia packages saves a lot of development time, which is a decent possibility).


I used it for another project but was not impressed. When I tested it, the documentation was scarce, and the deployment is harder than Julia because it needs a working C compiler alongside the CPython interpreter.


Very true, it isn't usually the best designed language that wins in these cases. The language just needs to be good enough, as C++ and JS were. Python is nicer to write than either, and with Mojo (https://www.modular.com/mojo) possibly rising in popularity, Julia will lose some of its benefits. The momentum is behind Python, which is the ultimate factor.


> which pure "A-type" data scientists tend to prefer: R, SPSS, Matlab

I very much disagree; I would say it's the opposite. The only competitor for data scientists is R, especially if you're doing stats-heavy analysis (as opposed to ML-heavy analysis). SPSS is in my experience used mainly by people without a data-science background, e.g. in psychology; similarly Matlab, where the only users I know are engineers.

I would bet that the overwhelming majority of data scientists use Python, even if it's just to set up the data pipeline. It's just the better general programming language, and data scientists are not analysing all the time but have to get the data in shape and available first.


I agree with these as major points. A few other secondary ones

Ruby was primarily maintained in Japanese, so there was a barrier to entry for language-level issues. It also lacked English-language evangelists and a university presence.

When Ruby was new (invented 1995), Python had some older design issues (as it was 6 years older), but it really recovered and implemented a lot of change through Python 2 (2000) and Python 3 (2008). Though there were compatibility issues in the short term, in the long term this change worked out.

Ruby inherited from perl TIMTOWTDI (there is more than one way to do it) philosophy which is a little more at odds with the scientific community


> Ruby inherited from perl TIMTOWTDI (there is more than one way to do it) philosophy which is a little more at odds with the scientific community

It's at odds with humanity, TBH.

I once spent like two days trying to figure out how a small (couple hundred lines) Ruby script worked, because it overrode `method_missing`.


Wasn’t python released in 1991? That would make it 4 years old when ruby is released.


To add to your comment, the web space was also already very crowded, with PHP, C#/ASP, whatever Java people were using, Django, etc. Rails has always been fairly niche compared to giants like PHP.


People like to hate on PHP (and to be honest I really never enjoyed it), but there was a time when if you wanted to build powerful, large scale websites (or hosting, or scalable web farm deployments) it was the most reliable and performant thing short of Java.

It also helped that the PHP ecosystem had some pretty solid and battle tested HTTP components and pretty productive frameworks--server-side rendering with Varnish was as fast as a CDN-backed site feels today.


The things PHP had that made it win (for a while) were that it was easy to set up and configure for a hosting environment (Apache plus mod_php worked out of the box consistently), and that the HTML-first design made it easy to add to static web pages, whereas forms and CGI were multiple steps and more confusing for less experienced devs.


I'd qualify that. A lot of the hype around PHP being easy for cheap web hosting misses the point that a lot of that cheap hosting was configured with PHP as a CGI module, not mod_php. In that sense it was on a level playing field with Perl.


Agreed. The one thing PHP demonstrably did better than other langs (other than be a simpler Perl) was be web-first.

Other langs assume you could be doing anything, and here's the extra steps necessary to make it serve pages. PHP assumes the reverse.


> Python ended up 'specializing' in data contexts, thanks to Numpy / Pandas

Yeah, it was a success 20 years in the making. It had a fair amount of input from scientific users, and that being GvR's background, he responded positively; e.g. the extended slices (striding) and the ability to index with an unparenthesised tuple (multidimensional indexing) were very much added for the convenience of numpy's ancestors.
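Both are still visible in any numpy snippet today (toy sketch):

    import numpy as np

    a = np.arange(12).reshape(3, 4)
    print(a[::2])     # extended slice with a stride: rows 0 and 2
    print(a[1, 2])    # unparenthesised-tuple index: row 1, column 2 -> 6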

I would not say it had no competition, Perl used to be quite popular in at least some sci-comp fields (I want to say bioinformatics but I’m probably wrong).


Why probably wrong? Bioperl was very big in the bioinformatics world.


> Why probably wrong?

Because I don’t have a great memory and I was never into Perl so I was extremely unsure of the suggestion and didn’t want it to come across as an assertion.


James Tisdall and O'Reilly produced "Beginning Perl for Bioinformatics" which sold like hotcakes at Perl conferences.


This is part of it, but the reasons go back further. As others have mentioned, Python is a pretty old language. It debuted as "Python" in 1991, even before Linux was out, and it existed as ABC even in the late 80s. It had a very good, easy to use C API early on. Combining that with the way it can load shared object files as modules and the ability to overload operators and effectively extend the default syntax allowed it to mimic both MATLAB, with SciPy and Numpy, and R's data.frame, with pandas, which is a huge part of why it attracted all the people from scientific computing and data analytics. The API of the major packages was nearly identical to what they were already familiar with. It's not an array language on its own, but the ability to redefine operators and the fact that array indexing is itself an operator made it possible to turn it into an array language. The fact that all of these packages can easily just wrap existing BLAS and LAPACK implementations made it very fast for these purposes, too.

That also happened pretty early. Numpy's ancestor, Numeric, came out in the 90s. But even before that, Python and Perl were about the only two scripting languages other than the shell that were guaranteed to be present on a GNU/Linux system from the start. That made it really popular for creating high-level system utilities. A whole lot of the util-linux packages, libvirt, and the apt packaging system make heavy use of Python. So it's not just the academics who were already familiar with it, but system administrators and hackers, too.

It also gained widespread popularity as a teaching language alternative to Java. Once MIT started using it in their intro to programming course and put that on the Internet via OCW, it really took off as many people's first exposure to programming.

The batteries included approach to the standard library makes it very usable for one-off automation tasks, too. I don't even like Python that much, but the other day I was doing some math work and just needed to compute a bunch of binomials and enumerate subset combinations. You want to look up how to do that in your language of choice? Python just has math.factorial and itertools.combinations in the standard library. If you're using Linux or Mac, you already have it. It may not be a great choice for application development, but if you need to do some quick, interactive tasks that are never going to get deployed or even necessarily stored on disk as files but just run once from the repl, and it's too much for your calculator app or Excel to handle, Python is perfect.
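For instance, the two batteries in question (a quick sketch):

    import math
    from itertools import combinations

    def binomial(n, k):
        return math.factorial(n) // (math.factorial(k) * math.factorial(n - k))

    print(binomial(5, 2))                 # 10
    print(list(combinations("abc", 2)))   # [('a', 'b'), ('a', 'c'), ('b', 'c')]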


Not only in math/academia, but nearly everywhere. As Python started coming pre-installed on macOS and was a lot easier than fumbling with shell scripts, a lot of scripts were written to automate boring stuff. Also, the hidden beasts are Django and Flask, which are commonly used by both serious and non-serious folks, further growing the userbase.

Also, Python is less verbose (compared to Java, C#, Scala, pre-ES6 JavaScript, etc.) in terms of syntax, so it is much easier to learn, and it can do both OOP and functional styles while being easier to adopt (no compiler, easy installation, lots of popular libraries); hence academia picked it up. Also, Python has built-in parallel processing features which are less daunting (in terms of concepts and syntax) than competitors' (C++/Java etc).

See the popularity of Go; the same thing happened with Python.


I think you're describing a symptom (popularity in data contexts) rather than a cause. If you're looking for causes, https://jeffknupp.com/blog/2017/09/15/python-is-the-fastest-... makes a good case that the answer is PEP 3118.
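For context, PEP 3118 is the revised buffer protocol, which lets different libraries share the same block of memory without copying. A small sketch (assuming NumPy is installed; the values shown assume a little-endian machine):

    import numpy as np

    buf = bytearray(b"\x01\x00\x02\x00")       # 4 raw bytes
    arr = np.frombuffer(buf, dtype=np.uint16)  # zero-copy view: [1, 2]
    view = memoryview(buf)                     # plain-Python view, also zero-copy

    buf[0] = 7                 # mutate the underlying buffer...
    print(arr[0], view[0])     # ...and both views see it: 7 7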


There is a lot more history before the data part. But yes, it is a big "modern" driver for its popularity, and Python owes it to its easy C interop.

I would contest the Ruby-as-web-dev part, though. I used Ruby for Puppet and Vagrant a long time ago and it was great, but all my experiences with Rails (and associated gems) turned into maintenance nightmares.


As a Python programmer in the 2.x days, I think data science happened fortuitously to counteract the exodus over the 2->3 transition. For a long time, it felt to me like Python was on the way out because of the changes.


At the same time, a lot of undergrad programmers were being introduced to Python 3. I think that's probably the single biggest factor in Python's success.


It’s not because of data science; it’s popular for the same reason Java is popular: it’s what they teach in schools. It’s a virtuous cycle: industry adoption grows school adoption, which grows industry adoption…


Being taught in schools is helpful, but not sufficient on its own, or we'd see more professional Forth, Prolog, and Scheme programmers.


future headlines:

Scratch for Cancer Diagnosis.

NASA adopts no-code GenAI interface for critical navigation and telemetry on first human flight to Mars.


> In that space, it had no competitors.

Well, there's R.

But basically, yes, that's the long and short of it.


Or Julia, a language custom-designed for data science.

Also, MATLAB was the go-to before Python took over the scene. So it definitely has competitors.

Python's strength is that it's the jack-of-all-trades language: it's not the best at anything, but it's pretty good at everything.


Julia is much younger.

MATLAB is closed source. I haven’t touched it since I had a uni license.


Today I learnt I'm fucking old. My first web programming exposure was Perl CGI scripts.



This fine day, we are all old together.


Me too... learned it in college... use CGI;

It was fun.


#metoo, 1993.


That's a fascinating but wrong way to look at history. Python never specialized; it always played strong in many popular fields.

From the early days it had a strong leaning toward science; after all, that's where it came from. But at the same time it also had a strong presence in Unix circles and established itself as a good tool for sysadmins, competing with Perl and Bash. It also had strong bindings to other languages, which early on earned it a reputation as a glue language. It dabbled early in web stacks and networking too, but had too many web-stack options to gain the "one framework" fame of Ruby on Rails.

Python was simply the language that let you play in all the important fields, with simplicity and even a reasonable level of speed. That was something other languages lacked: they were either complicated, slow, or missing support for specific areas. Python somehow covered them all and made an impact across a broad range, which made it popular, because for the majority of use cases it was a natural option.


Good answer.

Python is, imho, not a very good language. But it has a few great libraries, which led to overwhelming momentum and inertia. In a cruel twist of irony, those libraries are probably written in C.


I'll argue that Python is very alive in the system-management/scripting space.

Because of the library support (especially for data), it's also popular for API servers. It's not as popular for entire applications, but it is still very active.

EDIT: Django is actually about equal to or more popular than Node.js based on Google Trends. Not sure that means anything, as I don't know how many people google "nodejs" when working on it.


I think HTMX has given a new lease on life to old-school server-side frameworks like Django and Flask.
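The pattern is appealingly simple. A minimal sketch with Flask (my illustration, not from the comment; hx-get and hx-swap are real HTMX attributes):

    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        # The button fetches an HTML fragment and swaps itself out.
        return """<script src="https://unpkg.com/htmx.org"></script>
                  <button hx-get="/clicked" hx-swap="outerHTML">Click me</button>"""

    @app.route("/clicked")
    def clicked():
        return "<p>Rendered on the server, no JSON API needed.</p>"

No build step, no client-side framework, and the whole backend stays in Python.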


> Python ended up 'specializing' in data contexts

There's a big sub-continent of data stuff in the Python ecosystem, but Python the language hasn't specialized there, I'd argue: it has enabled third-party number-crunching and data libraries like NumPy and pandas, but not at the expense of the other domains where Python is used.


But it's still slower and more energy-consuming than JavaScript or Java unless you use it as a wrapper around C/C++ libraries:

https://stratoflow.com/efficient-and-environment-friendly-pr...


It is easy to underestimate how much Python as a language benefited from numpy, scipy, pandas, and scikit-learn.


Python doesn’t compare to R for analytical work: it doesn’t have the ecosystem, and R is specialised for it.

Where Python broke in was machine learning, which is not analytical work and often involves lots of general-purpose programming. sklearn, skimage, OpenCV, TensorFlow, Torch, JAX, and so on are often used inside a general code base. Torch was actually a C++/Lua framework initially, before switching to C++/Python.

Python has a dominant general-purpose ecosystem. It’s also a simple language to learn compared to R, MATLAB, etc., which are just horrible to use for data structures or for paradigms other than vector/matrix-based pipelines.


I think the data science & web competition story is at least 80% of it, but I'm going to throw out a few possible ideas I haven't seen mentioned that may have helped. Maybe I'm way off base, though.

* Udacity's popularity. Its first few free courses were in Python. I believe they advertised it as a way to get into Google. Not sure how many people actually got into Google via Udacity, though.

* LeetCode-style job interviews. Maybe I'm way off here, as I don't do these, but from what I've read, people who spend a lot of time on them prefer Python because the syntax & libraries often give a speed & simplicity advantage (see the sketch after this list).

* Lots of devs had to install Python just to use some utility or tool someone else created.
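On the interview point, here's the kind of brevity people mean (my example, not anything specific from those sites):

    from collections import Counter

    # Classic warm-up: are two strings anagrams of each other?
    def is_anagram(a: str, b: str) -> bool:
        return Counter(a) == Counter(b)

    print(is_anagram("listen", "silent"))  # True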


Good points.


Maybe it all came down to the decision to fuse Python's Numeric and Numarray libraries into NumPy, providing a unified "fast" MATLAB-like library, which ended up being a great fit for a lot of data-science use cases.


I might be wrong, but I think it started with Jupyter/Matplotlib plus Python's permissive license. I heard it all started with a physics professor who was unable to use MATLAB for teaching.


Python also had a near-death experience with the Python 2 to Python 3 transition; it could have wound up like Perl did with Perl 6, but I think the data science use case really pulled it through.

Overall I think Python is the best language for the non-professional programmer who wants to script things and put their skills on wheels.


Python specialized in ease of learning and running the code. Everything else came after as a byproduct of the large user base.

One reason I switched our team from MATLAB to Python was the capabilities of the core libraries. Argparse alone sold it for us.
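For anyone who hasn't used it, argparse gives you flag parsing, type conversion, and generated --help output in a few lines (a generic sketch, not our actual tool):

    import argparse

    parser = argparse.ArgumentParser(description="Process a data file.")
    parser.add_argument("path", help="input file")
    parser.add_argument("--iterations", type=int, default=10)
    args = parser.parse_args()

    print(args.path, args.iterations)

Run it as `python tool.py data.csv --iterations 5` and you get validation and a usage message for free.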


And this was all despite Python 3, which was, imo, the worst software migration in history.


The worst one that succeeded.


Bingo! Then you have junior devs with 2-3 years of experience join your company because they know React, and they give you a completely blank stare when you ask how they're serving their data.


It actually does have a competitor: R. R has a superior ecosystem for quite a few data tasks, mainly in the analysis/statistics space.

However, aside from that, R is vastly inferior.


That was MATLAB. Python was the substitute.


Ruby is also kind of horrible.


NumPy and pandas exist as they do because Python is written in C and interoperates so well with C, so I believe that would be the better answer.
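As a tiny taste of that interop, the stdlib's ctypes module can call straight into a C library with no extension module at all (a sketch for a Unix-like system; NumPy and pandas use the C API directly instead):

    import ctypes, ctypes.util

    # Load the system C math library and call cos() from Python.
    libm = ctypes.CDLL(ctypes.util.find_library("m"))
    libm.cos.argtypes = [ctypes.c_double]
    libm.cos.restype = ctypes.c_double
    print(libm.cos(0.0))  # 1.0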

