Question inspired by the recent discussion (https://news.ycombinator.com/item?id=29354474) that mentioned how Python wasn't a big deal before it had a data science / ML ecosystem. Right now, in addition to data science, there's also web backend stuff with Django, Flask, FastAPI. What else are people doing with it now? What about 20 years ago?
People who were using Python back then, what was your experience?
Going back in time to 1994, we had a molecular dynamics visualization tool called VMD. It had a home-grown scripting language. I had ideas for a new programming language designed for molecular modeling (because "tiny languages" was the Unix philosophy I had picked up). After looking around, I realized 1) I didn't know how to really make a new domain-specific programming language, and 2) there were existing programming languages that could be embedded in an application, and extended with domain-specific features.
I chose Tcl, because it was very similar to, but better than, our home-grown scripting language. And it was easy for people to learn. (Yes, easier than Python.) I considered Python, but was thrown off by the whitespace. And Perl wasn't a good fit. (Perl 4 C extensions were hard to do.)
Instead, another visualization package, UCSF's Chimera, was the first to have Python integration. In ... the mid-1990s?
In 1997 I was doing more Perl work for bioinformatics. I had to read the advanced Perl book to figure out Perl 5 objects. I wrote the equivalent program in Python, and it was just so much easier, even for a neophyte Python programmer. But bioinformatics at the time was Perl-oriented (see "How Perl Saved the Human Genome Project" - the author, Lincoln Stein, also wrote CGI.pm and developed ... OraPerl was it? One of the Perl 4 branches with RDBMS support), and a Python product in that market was a no-go.
In 1998 I had the chance in a greenfield project to start with Python. Like I said, we used a C library, the Daylight toolkit, which has an object-like data model exposed as opaque integer handles. Python's support for C extensions was outstanding; Dave Beazley had already shipped SWIG, which simplified the process, and Roger Critchlow had developed "DaySWIG" to generate bindings for Tcl, Python, and Perl.
But that meant programming at the C level, including manual memory management. Something like 20% of the Daylight toolkit calls were to "dt_dealloc()", to have the Daylight toolkit free an "object". Attributes were function-based, e.g., dt_charge(atom) instead of atom.charge.
My high-level interface replaced manual garbage collection with Python's, which was a perfect fit for Python's reference count system. And I used __getattr__ and __setattr__ to make object properties accessible via attributes.
This quickly became the base for chemistry development in the company because it was so much easier than using the C library API. (FWIW, the company was doing machine learning in the late 1990s. I mostly worked on the parts which integrated the ML tools with molecular structure. https://patents.google.com/patent/US6904423B1/en is a patent showing some of the types of ML done. The part "A more representative feature of the molecules is the maximum common substructure (MCS) that is contained in all of the molecules in a hot spot" uses MCS code I started.)
I still kept in touch with bioinformatics. In 2000 I co-founded (with Jeff Chang) the Biopython project. Python was starting to make inroads in bioinformatics, mostly in labs that didn't already have a strong Perl presence.
Perl never had a strong presence in biomolecular or pharmaceutical structure software, I think because expressing graph data structures (where nodes are "atom" objects, with atom properties, and edges are "bond" objects, with bond properties) was much harder in Perl than Python.
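For a sense of what that graph representation looks like in Python (class and attribute names here are illustrative, not from any particular toolkit):

```python
class Atom:
    """A node in the molecular graph, carrying atom properties."""
    def __init__(self, symbol, charge=0.0):
        self.symbol = symbol
        self.charge = charge
        self.bonds = []          # back-references to incident edges

class Bond:
    """An edge between two atoms, carrying bond properties."""
    def __init__(self, a1, a2, order=1):
        self.order = order
        self.atoms = (a1, a2)
        a1.bonds.append(self)    # register the edge on both endpoints
        a2.bonds.append(self)

# Build water: an oxygen bonded to two hydrogens.
o = Atom("O", charge=-0.8)
h1, h2 = Atom("H"), Atom("H")
Bond(o, h1)
Bond(o, h2)
print(len(o.bonds))  # 2
```

Nested references like this are natural in Python; in Perl 4/5 the same structure meant hand-managed references and blessed hashes, which is the friction being described.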
Another company, Combichem, used Python extensively in-house as a replacement for shell scripting. That is, their user base wanted commands which accepted filenames as input and output, rather than passing Python data structures around. They also used Python (backed by wxPython) to build GUI tools. Several people left Combichem and started Rational Design, with a new code base influenced by their experience at Combichem. That code still exists as RDKit ("RD" = "Rational Design").
I started getting consulting jobs because I knew Python and pharma/biotech companies wanted people with Python experience. I worked on a project for AstraZeneca in Sweden to integrate a large number of molecular property prediction tools into a CGI (!) application. That had to work with command-line executables, Perl programs, R libraries, NumPy analyses, and more. See https://www.python.org/success-stories/python-for-collaborat... .
So for me, I would say much of pharmaceutical early drug discovery R&D was using Python by 2006. Certainly by 2002 or so I didn't have to explain what Python was at conferences.
For another talk about Python in the physical sciences in the 1990s, see Dave Beazley's talk (yes, the same Beazley) at https://www.youtube.com/watch?v=4RSht_aV7AU . ("Keynote talk, presented live at PyData Global, October 28, 2021. In this talk, I give an oral history from the early days of scientific Python and describe how Python ended up being used on a Supercomputer at Los Alamos.")