I'm normally joyful when a modern language (say, julia) decides to break backward compatibility to improve the language on a fundamental level. This is mostly because I'm not too vested in it.
For languages where I have 10+ years of work behind me, it's the exact opposite, and there I see the C/C++ model of not breaking backward compatibility as a much saner choice.
Python in particular is an extremely bittersweet pill to swallow. The amount of small and large breakage I've had ever since I started to use it for production work (v2.6 and onward) has been relentless. Small and large breakages requiring constant retooling and reworking. Minor release? Yeah, still breaks just as much as a major one. pyenv doesn't help when your dependencies need to be updated, and the updates don't support the lower versions you wanted to use anyway. Containerizing everything is not a solution for a project that is expected to be supported for years, so the real way forward is to fix it, again, and again, and again. My experience is that every 6 months there _will_ be work just to fix bitrotting. A one-year-old python project will hardly even run unless it uses the stdlib only.
By comparison, I never experienced such churn with perl.
For a different perspective, I started using python about 5 years ago, and I have never experienced a single breakage due to a new python release. Instead, I'm always excited to read new version release notes, and it's the only thing that may make me move away from debian stable someday. But waiting a few months, using containers, or compiling a newer python is usually fine when I can't wait.
Removing deprecated stuff is… the point of deprecating stuff?
> Removing deprecated stuff is… the point of deprecating stuff?
The point of deprecating stuff is to redirect to newer/better/saner APIs, not necessarily removing the thing.
This is where I like Java's approach. Stuff is deprecated in the JDK but fairly rarely is it removed. When it is, it's because the feature is either unused or so detrimental to the ecosystem as to warrant removal (see: finalizers).
My experience with Python is probably closer to nicoco's than wakeupcall's (usually pretty painless), but I tend to prefer the Java approach. Waiting three releases for removal gives you four years of support and seven years of security fixes. That isn't unreasonable but feels a little too fast in a language as old as Python (especially with people still burned by Python 3).
Same here. I don't think I've been affected by a single deprecation since upgrading to Python3 in 2015, and even that upgrade wasn't that difficult -- mainly I was just forced to fix a few things I had been doing incorrectly, which imho is a good thing.
5 years is not enough to be experiencing this. The pain is very real and consumes a tremendous amount of effort. I've started writing tools just to create inventories of which packages with which versions I'm using where in order to try to automate the upgrading of virtual environments as much as possible. It's a horrible experience.
The biggest boulder I currently see rolling towards me is the MongoDB driver since I need to upgrade the databases and the drivers can't cope with the latest version.
Other than the 2-to-3 transition, this doesn't jibe with my experience as a developer who's used Python in various projects for over 20 years (since the 1.5.2 days).
I've dealt with a lot of sideways yak-shaving work in my career, and still do frequently today, and Python has been the absolute least of it.
Can you detail any of these "small and large breakages" or "bit rot"?
It just doesn't jibe with my experience with any large code base in nearly any language, python included.
Python 3.10 introduced asyncio changes that broke compatibility with earlier 3.x releases. I don't have the details at hand, but as I recall, some code that supported caller-supplied event loops not only broke, but could no longer be made compatible with both old 3.x and new 3.x python versions (unless separate code paths were maintained).
This was frustrating, because I don't expect point-releases to break the standard library, let alone require a code fork to maintain cross-distro compatibility. It was disappointing, because I had never encountered such breakage from python while Guido was still in charge.
I had the same experience. I was pretty annoyed to see that some server apps were unable to run after the upgrade because of some changes to the loop. "Why?", I was asking myself.
Python 3 came out 14 years ago, and changing the syntax of one function wasn’t even a big deal then. Literally a 10 minute code search to fix a project.
It was literally months to fix every change in every project and to fix all the issues coming from that. ("print" wasn't the only change in 2->3)
Our codebase was (and is) old, but the forced change came recently, when ubuntu decided to remove python 2 support (ubuntu 20.04?). And when ubuntu decides that they want python >=3.12 as the only installed version, that means another change and more work to fix stuff that worked just fine.
It's a real shame, since this is a project management issue.
You'd expect python to be a stable language given its age.
I expected that to be the case at 2.7. Then at 3.1.
At 3.12 it's still not the case.
It's reasonable to expect this is not going to change.
It's gotten far, FAR worse since Guido stopped providing as much guidance. The increase in mandatory updates at regular intervals is a recipe for disaster in most cases, as it is in python.
Also, to pile on top of the stdlib problems - it's probably clear to most everyone at this point - but NVIDIA and Google (tensorflow) are really some of the biggest reasons python sucks in this way.
They're the ones that cause most of the breakage. It starts with Nvidia making breaking changes, which then propagate to tensorflow et al, and then to the enormous number of packages that depend on these fundamental packages.
A problem with pinning is that it doesn’t only freeze features, ensuring that the features you need will be available forever, but also bugs, in particular security bugs, ensuring that security bugs you have today, but that you and possibly even the entire world are unaware of, will stay around forever.
PyPI is not required to run Python though. You could serve or source those packages elsewhere. Pinning the dependencies would at least resolve compatibility of the actual code.
You can. Package everything you need with the environment and you’re done. Now whether future versions of whatever container you choose to run it in will support establishing the environment is a separate detail. There is no answer. You can go all the way to reserving an entire computer to run mission critical code but you don’t know if it is future proof in terms of electrical specifications.
That’s the point of civilization though. You iterate and what works out, other people copy and iterate even more.
> You can go all the way to reserving an entire computer to run mission critical code but you don’t know if it is future proof in terms of electrical specifications.
See, that's a beautiful example, just not in the direction that I think you intend it. I can, in fact, take an appliance that's a few decades old and plug it into a modern outlet and it'll work just fine. I do not, in fact, have to preserve an entire electrical grid just to run my application^w appliance, because the underlying infrastructure maintains compatibility.
No, you took it incorrectly; the example works exactly in the direction I intended. Just because electricity has been relatively constant in specification doesn’t mean it’ll always be the case. You can work with that assumption up to even a third or fourth order approximation. But on some level it’ll fail.
Which brings you to the larger generalization that you haven’t grasped: the further down the stack you go, the more stable it is. Python is nearest the top layer. Hardware changes in decade cycles. Electricity likely in century cycles. Keep going. The laws of physics change never. Obv the time scales aren’t strictly logarithmic. Human ingenuity means we could find some exceptional electricity specification tomorrow and convert the whole world in one decade. But it is relatively rarer.
Yah, nothing the GP says makes any sense. I've been working with large code bases with many languages for many years. None are perfect, but the GP makes no sense, sounds like poor decisions or poor code rather than a poor language.
The idea that pinning versions and using environments will make your code run stable forever is very wrong (in python).
You can't even get things to run for a few years this way. Many of the packages are simply not available anymore. Ever tried to get an environment running that uses qt4 on py2.7? It really wasn't that long ago that py2.7 was standard (in the stable code realm that we're talking about).
If you want a long term snapshot of other people's packages, the onus is on you to store them indefinitely. Packages can be pinned and installed from a local folder.
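In practice that can be as simple as the following sketch (the pinned package is illustrative, not from this thread): `pip download` snapshots the resolved dependency set into a local folder, and later installs are forced to use only that folder, so a package disappearing from PyPI no longer matters.

```shell
# Snapshot the pinned dependency set into ./vendor (needs network once);
# the pin below is a hypothetical example.
echo "packaging==24.0" > requirements.txt
mkdir -p vendor
python3 -m pip download -r requirements.txt -d ./vendor || echo "offline, skipping snapshot"

# Restore later, strictly from the local snapshot, never from PyPI:
python3 -m pip install --no-index --find-links=./vendor -r requirements.txt || true
```

The `--no-index` flag is what makes the restore hermetic; without it pip will still quietly reach out to PyPI.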
The only language I've experienced so much churn with is Swift (which loves to break things, vs. the relative stability of Objective-C.) But Apple has already outsourced yearly iOS compatibility updates onto every developer, and now they have even decided to remove apps that haven't been updated recently from the app store. A stable game/app development platform it is not.
I am still shocked though by the Python project deciding to break billions of lines of legacy code. What other platforms demonstrate that kind of contempt for their developers?
Removing the print statement seems like a particularly negative trade-off: breaking code in the name of syntactic purity, instead of simply deprecating the print statement and letting it live on compatibly with the print() function.
Since then a bunch of new syntax (not all of which is bad) has of course been added.
It's also weird that until now they seem to have largely ignored performance and focused on adding random new features.
I’ve had that problem with Perl, too, so I think it’s more about which set of third-party packages you depend on and the norms in that space. One interesting tradeoff is that Python’s standard library is pretty large so you have a fair number of programs which can be frictionless by sticking to the stdlib.
This is perfectly exemplified by Python 3's insistence that you call print with parenthesis:
>>> print "hello"
SyntaxError: Missing parentheses in call to 'print'. Did you mean print(...)?
I empathize with the lang devs in that having two forms of print is nonoptimal, but the fact it tells you to do something different while fully understanding what you said (as it were) is what really irritates me. It comes off feeling needlessly pedantic for the language.
I don't believe that the problem was that "having two forms of print is nonoptimal"; I believe the point was instead to remove a parsing ambiguity, which in turn allowed a whole class of errors to be caught at load time that previously weren't; and allowed the syntax to be extended in other new ways that previously couldn't have been encoded as parse rules due to the ambiguity.
While the Python3 lexer is specifically hacked up to recognize 'print' as a distinct lexeme — and to thereby emit an additional parser meta-instruction lexeme that triggers a special error-handling path in the parser if you then go on to make a syntax error per the newer uniform syntax — it doesn't actually know what you were trying to print. (That'd require a successful parse!) If the Python3 parser tried to do the compatible thing, that'd require ditching the uniform syntax altogether, and going back to the ambiguous parser.
It's a bit like Error Correcting Codes — the uniform Python3 parser knows enough to know that you did something wrong, and is provided by the lexer enough context to guess what kind of failure it was; but it doesn't have enough information to "Do What I Mean", because that's a strictly-greater amount of information.
No need to disagree, that's just two different design patterns among programming languages. If stuff like that matters, you can always choose Ruby or something similar. It has its own shortcomings though and neither approach is ideal in every situation. That being said, the transition from Python 2 to 3 was the most horrible thing that ever happened to a popular language. Print for example was fine as a keyword and forcing it into a function after so many years was a terrible retroactive design choice.
Meh, the print change was fairly trivial. You could backport the print function in Python 2 codebases, and updating existing print statements to use the function was easy to automate. String changing from bytes to Unicode was a lot more painful, in my experience.
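The backport in question, for reference: a standard `__future__` import that is a harmless no-op on Python 3.

```python
# On Python 2 this makes print a function; on Python 3 it's a no-op,
# so the same source runs under both interpreters.
from __future__ import print_function
import sys

print("hello", "world", sep=", ")
print("to stderr", file=sys.stderr)   # replaces py2's `print >> sys.stderr`
```
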
That was just the surface. Even the Python Foundation knew that their 2to3 converter tool just wasn't sufficient, and they had to delay the discontinuation of Python 2 by half a decade because there was so much unportable code out there. In the end we had 12 years where the two languages lived side by side, and my first job using Python had us use 2.7 for new projects due to library concerns, despite 3 having been around for more than 7 years already.
Unfortunately, it also had the effect of breaking language tutorials going all the way up to "hello world". A tough experience for new learners, who are least able to debug those simple errors.
Back when Python 3 was first released, the error message was not so clear, either...
This is why I appreciate languages with a strict and non-strict mode. Let me make that choice. In particular, if the parser is intelligent enough to understand what I meant and is just throwing a syntax error to be pedantic, then let me control that. Needlessly taking choices away will always be a frustration.
I have a pragmatic take on this, if only because of Python's ease of throwing a script together and its non-technical userbase. Were it something that affects correctness, I'd more fully support this break of backwards compat.
“Why don’t we just keep both?” probably has many answers. The overwhelming one for me is that if everyone’s doing the same thing two ways, then everyone has to learn all the esoterics about the print statement.
“Why don’t you just do it for me?” is a cardinal sin for a runtime. And already exists in migration tools.
Being able to use it in higher-order functions is somewhat un-Pythonic, but I can accept that as a reasonable use for it. I'd still prefer that the print statement be deprecated but usable.
I think composition is becoming increasingly more within pythonic idioms... it is well suited towards pythons usage as a high-level controller around low level libraries in other languages.
see also: the recent addition of match semantics in python
The problem is that the older syntax had way more ad-hoc oddities than just the lack of parentheses. Having to support weirdo statements like `print "hello", "world" >> sys.stderr` complicates the parser a lot and creates ambiguities, especially if you try to have it be both be this special-case statement and be a function object.
Is `print >> obj` a print call with redirection to the file-like `obj`? Or is it an operator call between those objects? What does `f = print` do, assign the function object or the result of calling print with no arguments?
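A quick sketch of how Python 3 resolves those ambiguities by making print an ordinary function object:

```python
import sys

f = print                      # unambiguous: binds the function itself
f("hello", file=sys.stderr)    # the py2 `print >> sys.stderr` equivalent

# higher-order use, impossible with the old statement form:
printer = lambda xs: list(map(print, xs))
printer(["a", "b"])
```
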
If you type `func "arg"`, you get a generic syntax error.
If you type `print "arg"`, the syntax error is a bit more informative, but I wouldn't say that the compiler "understands" what you mean. It's just making a guess based on the fact that the previous token was "print".
On the other hand, supporting a special call syntax just for print, which includes more than just parentheses, would be substantially more complex. And then Python code in the wild would be more inconsistent and there will be endless debates about `print <args>` versus `print(<args>)`.
While it's been coming for a long time, it's still frustrating to see the cgi module go. It's going to create a load of busy work to rewrite some low-traffic scripts.
I see telnetlib is on the way out in 3.13 or so, which means a lot of pointless busywork in my future. It's a super useful module for a lot of tasks.
The suggested replacements either are asyncio based (which means whole ass rewrites as asyncio is really fucking opinionated), or are excessively restrictive in some stupid way.
"Infosec" at work tend to raise a ruckus when their scans detect "older" versions of Python on machines too, so virtualenv'ing to pin versions or similar is often not the most practical in prod without getting an exception.
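For the simple "connect, send a command, read the reply" jobs where telnetlib was often used as a dumb TCP client, a plain socket can stand in. A minimal sketch, not a drop-in replacement: it does no telnet option negotiation, and the read-until-close/timeout strategy is an assumption about the peer.

```python
import socket

def send_command(host: str, port: int, command: bytes, timeout: float = 5.0) -> bytes:
    """Send one line and read whatever comes back until the peer closes."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command + b"\r\n")
        chunks = []
        try:
            while data := sock.recv(4096):
                chunks.append(data)
        except socket.timeout:
            pass  # treat a read timeout as end of reply
        return b"".join(chunks)
```

For anything that actually negotiates telnet options, this is not enough; that's the gap the removal leaves behind.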
I have an X-ray flat panel detector worth $35k (still >$5k in eBay value today); it runs embedded Linux on an obsolete Samsung ARM processor and uses telnet to talk to the console computer over a point-to-point Ethernet link.
Apparently, it's impossible to migrate to ssh, and why the ** do I need to care about security here?
No. You understood the phrase incorrectly and assume I mean "bang the head against the wall and don't you telnet". Of course if you have no other choice, go for it. But if you do...
Ok. Telnet is a protocol, it is not synonymous with TCP.
As for unencrypted traffic generally, I see no reason for fetishizing encryption. "Compute" is not synonymous with teh cloudz. Cloud tech is not synonymous with teh cloudz. It's trivial to put e.g. Nginx in front of something.
If you need encryption between 127.0.0.1 and 127.0.0.42 you clearly don't trust the hardware you're running on; I'm sure someone's got some clever way of hiding the private keys, but "clever" doesn't mean provably secure. Between containers... meh. Show me you thought about it and have a threat model.
Between datacenters... not so much. Do I care whether the connection is encrypted, or whether all traffic goes through a tunnel? I'd like to see a threat model. Is sigint included?
Even across the globe I might make an exception for a well-reasoned argument. That would almost certainly have to be data which was an observable: the time, a flow rate. Encryption is not the same as tamper-proof. Even unencrypted data might need defense against that. (Maybe secrecy is easier than nonrepudiability. Maybe.)
Secrecy is not identical to privacy. Security is not identical to either of those, and hypervigilance isn't the same as not needing to be concerned with something (these days security can be either of those). Observability might be more important than either secrecy or privacy depending on the application, and who is doing the observing.
A lot of it depends on where you demarc your rings of hell and how you defend things.
I send data in the clear sometimes and I'm ok with it. I also use a telnet client for decidedly "not telnet protocol" purposes related to diagnostics as well as observability.
I don't see why a threat model is needed here; encryption is always important, especially in big corps. It seems like you're saying that a normal person doesn't need security.
Not sure why everyone thinks that telnet is "not suitable for today especially in the case of security", although telnet over TLS exists. There are also certain things that telnet can do which SSH cannot (e.g. transport data for a block terminal).
And everyone who uses "telnet as a program for connecting to a server on any port and talking the protocol manually" forgot that telnet is a real protocol and this type of abuse only works if the telnet client is sufficiently dumb / doesn't negotiate parameters. There is a reason why netcat exists.
Those are data formats, and a client is needed to support those (on both endpoints). You better just use something modern like ssh instead of hunting for clients (and making sure they're reliable).
Telnet as a way of logging in to a remote system is bad, sure, but telnet as a program for connecting to a server on any port and talking the protocol manually was great, especially before http got so complicated.
Yes, but that doesn’t mean you just leave telnet running decades after it became obsolete. That should be a time limited waiver and mitigations, and if you have legacy devices which absolutely need Telnet you should be planning for what you’ll do when something that old finally breaks and you have the resources to port relatively simple code.
This is circular: sure, if telnet is _obsolete_, then remove it. But being obsolete exactly means no one is using it anymore. If someone is using it, then it's not obsolete.
Regarding security, some would advocate that telnet, or whatever else, is secure at least as much as the network underlying it. So anyone who puts their "legacy" telnet apps on a VPC is fine, and has decades more to enjoy software that has already been running for decades.
Telnet has been obsolete since the turn of the century. That doesn’t mean that nobody uses it but it does mean that everyone who does should be upgrading away from it.
Trusting the network for security was common in the previous century, but standards have improved since then. For example, sending your password in clear text is no longer considered acceptable by mainstream security standards, because of the risk of passive network monitoring or accidental exposure.
Obsolete means no longer in use or useful. It's been argued in this thread that it's both in use and useful. But yes, there are more secure protocols that overlap with most of what telnet can be used for.
Obsolete doesn’t mean something has no possibility of being useful but rather that it’s no longer commonly used because there are better alternatives. My son’s beloved steam locomotives still run but nobody uses them for normal commercial service because they became obsolete shortly after the invention of diesel and electric.
> (of words, equipment, etc.) No longer in use; gone into disuse; disused or neglected (often in favour of something newer)
Bringing that full circle, telnet used to be common but it has security issues (lack of encryption or integrity unless you tunnel over TLS, lack of modern authentication options, etc.) and so anywhere it’s still used we should be looking in to replacing it.
As discussed in the thread above, there are scenarios in which lack of encryption etc is simply not relevant. And when that is the case, why would you prefer a more complicated protocol with more moving things that can go wrong?
Telnet is still in use for its primary purpose, which is a bit different from a locomotive that has been relegated to a tourist attraction. Your last paragraph means you would like to make telnet obsolete, but it isn't yet. As evidenced in the thread, it's not practical or desirable to replace it with other technology in all its deployments. So, it's still hanging in there for now.
One does not use telnet as telnet when doing security work. Telnet + sclient (you can also use tons of other tools, too) lets you inspect many servers that have TLS security. SSH does not.
Kudos to the python core devs. The core library is getting cleaner and more maintainable and the language is getting more performant and powerful for users. Long live the snake
I’ve regularly used ‘python-build’ in docker container construction to build a minimal (or customised) self-contained Python binary, libs, tool chain, etc., in an isolated path, which makes copying from the build container to the final artefact container easy. It’s just all-around excellent tooling.
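That pattern reads roughly like this multi-stage sketch (image tags, Python version, and install prefix are all assumptions, not taken from the comment):

```dockerfile
FROM debian:bookworm-slim AS build
RUN apt-get update && apt-get install -y --no-install-recommends \
    git ca-certificates build-essential libssl-dev zlib1g-dev libffi-dev
# python-build ships inside the pyenv repo as a standalone plugin
RUN git clone --depth 1 https://github.com/pyenv/pyenv.git /pyenv
RUN /pyenv/plugins/python-build/bin/python-build 3.11.9 /opt/python

FROM debian:bookworm-slim
# copy only the self-contained interpreter tree into the final image
COPY --from=build /opt/python /opt/python
ENV PATH=/opt/python/bin:$PATH
```
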
`python3.10 -m venv` or `python3.11 -m venv` the default approach works fine and I don't need pip anything to get a virtualenv, what's the selling point of pyenv?
For most major distros, the oldest version of Python that they ship is usually the one that's used by other packages that depend on Python. It's not uncommon - especially for libraries - to need testing on something older.
it's very easy to install them, probably easier than learning pyenv but I have not tried. I just want to use the default settings as much as possible, they're guaranteed to stay as long as python is alive and typically have less surprises for me on daily coding.
Whether it's easy or not depends on the OS you're working on. On Windows, it's just another executable installer, so it's trivial. On macOS, the official installers are terrible (they have no uninstaller), so you want another way. On Linux etc., some distros have only one Python version in the repos, some have two or three, but the latest versions of all reasonable distros won't let you install Python 3.4.
Not really Windows, just WSL. There's a separate Windows fork that's recommended instead. Classic Python, splitting the ecosystem up. Why something like this isn't written in Go or Rust, which are actually cross-platform capable, escapes me; maintaining two implementations just seems like excess effort.
Because thanks to WSL the lingua franca of scripting languages is bash and that’s how pyenv started. Rewriting when it works fine on Windows and every dev I know uses WSL anyway is excess effort.
Hell, the fork could have used Go or whatever too but it went with a bunch of .bat scripts.
Indeed, I recently worked on a fairly simple deployment for a state government agency. When I described the process to their IT staff and mentioned that we prefer using WSL, they looked at me like I had grown 2 heads.
Even aside from that, WSL is only useful if you are writing code that will run on Linux at the end of the day. This is usually true for web apps, but that's not the only thing people write in Python. Libraries, in particular, need to be able to target Windows directly.
I should maybe test out pyenv again. I tried it several years ago but ran into some limitations. Or at least I thought I did; I might just have used it wrong. My main use case, back then at least, was to have just one global-like environment for every Python version I was working with, and sometimes an additional one for debugging/testing some different package combinations.
Back then it seemed like, to me, that pyenv only worked on a per-project basis, but I might just have followed incomplete guides and not looked into it deep enough.
How does pyenv compare with conda, mamba, pip and poetry? I typically use conda but ever since they broke with python 3.10 I have been considering moving to other environment managers.
I have not been able to understand why conda is so popular. I have no trouble with scientific computing with plain old `pyenv` and have never had any issues with C extensions or compilation with plain old `pip`.
Conda seems to be the most prone to getting in weird states or just hanging while "Solving environment." I have been happier leaving it behind.
Really the only two I would even consider using now are pyenv and poetry.
What's your platform? Issues with compilation are far more common on Windows, for example.
The other thing about Conda is that it doesn't cover just Python and Python packages, but all kinds of software that may be directly or indirectly relevant. For example, on Windows, there are Conda packages for the Windows SDK, and for various C++ compilers.
I love pyenv but hate that it forces me to specify the patch version when installing python. Most of the time I wish I could just enter 3.9 and have it give me the latest.
I use anaconda and every time I create a new environment I have to install jupyter and black in order to have access to them after running `conda activate $envname`.
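If I remember conda's config options correctly, `create_default_packages` in `.condarc` is meant for exactly this; treat the snippet as a sketch to verify against your conda version's docs:

```yaml
# ~/.condarc -- packages listed here are added to every
# newly created environment automatically
create_default_packages:
  - jupyter
  - black
```
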
If we're going to be making backwards incompatible changes to unittest, it'd be nice to introduce PEP8-compatible names, deprecate the old ones, and remove them in 3.20 or something.
(I know there's pytest and nosetests, but especially when teaching people, it's nice to use what's in the standard library and unittest sticks out for having things named differently)
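To illustrate the mismatch (the snake_case spelling in the comment is the wished-for alias, not a real API):

```python
import unittest

class TestMath(unittest.TestCase):
    def test_add(self):                    # test methods are snake_case...
        self.assertEqual(1 + 1, 2)         # ...but the assertions are camelCase
        # self.assert_equal(1 + 1, 2)      # hypothetical PEP8-style alias

# run programmatically rather than via unittest.main()
result = unittest.TestResult()
unittest.defaultTestLoader.loadTestsFromTestCase(TestMath).run(result)
```
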
> I know there's pytest and nosetests, but especially when teaching people, it's nice to use what's in the standard library and unittest sticks out for having things named differently
It’s a great time to be a Python developer. Python seems to be settling in to a really nice sweet spot of accessibility and power, and with the upcoming performance improvements it’s got an even brighter future.
Umm, I didn’t have a great experience with Poetry. For starters, I wanted to create a separate virtual env to install Poetry in and have Poetry install the project’s dependencies in a separate virtual env. There was simply no way to do that and I didn’t wanna install Poetry globally on my machine. I also didn’t wanna use docker just for this, so I realized that Poetry could just not be for me.
This isn’t a poetry problem. Use pipenv for this. It’s pip but uses separate venvs to install packages. Pipenv install poetry will do exactly what you want I think.
I hope we don't end up with the same things as the java world. JDK9 removed and moved to a library a few things like JAXB. Since JDK11, every new version is harsher for programs that use reflection to mess with non-published internals. Both are good changes, BTW.
In theory, you add a few libs and you're ready to upgrade. In practice, adding libs at scale is hard, and quite a few dependencies of dependencies of ... are using reflection to mess with internals.
As a result, there are still a lot of enterprise applications on JDK8 and JDK11, even if the security impact of this is bad and the fix should be easy.
I disagree. The entire point of setAccessible is to say I want to access nonpublic things. I shouldn't have to also say "Simon says, pretty please" for each such access on the command line to be able to do so.
Marking random hidden internals accessible doesn't mean a private API becomes public. setAccessible is a useful tool for debug access, or for when an annotation gives extra guarantees, or you can use it on code you have access to.
But you can't expect from your API vendor that they won't ever again change the internals of the implementation just because you forced your way in and monkeyed with the internals. Writing your code this way is a surefire way to need rework at some unpredictable time in the future.
If you're lucky, it just crashes. A worse possibility is subtle corruption that destabilizes your program. That is what happened in some of these cases: hash codes were stored/cached inside collections, and after an upgrade, people using reflection to 'restore' a collection didn't update those caches correctly or dropped elements in the wrong hash bucket. Then the hashmap had elements that were both there and not there, depending on how you queried it.
One of the differences of a senior engineer is that you not only say it works today, you can guarantee it stays working long term, even when the environment changes. Things like tests and comments are part of that. API contracts are a big part of that, both in being explicit about them as an API provider, and in not touching non-guaranteed parts as an API consumer. Using setAccessible like this is a grave violation of an API contract, and it takes away your ability to upgrade to a later version.
It not only pointed out places where I was using a deprecated alias, but also fixed a few places where I was using the API poorly (using "assertTrue(a binop b)" instead of "assertBinOp(a, b)").
I can't comment on the quality of the tool, I just don't like this attitude of "just use this tool we don't support, hosted on some platform we don't control, which might disappear tomorrow". Back in the day, this tool would have been shipped (and supported) with stdlib, or at least be frozen somewhere "official".
That tool is not a permanent addition to the list of dependencies, but for a single migration effort. As such, it is unnecessary to include it in the standard library.
If you really go "back in the day", this sort of tool wouldn't exist at all, and you would be expected to do it manually, because there were no easy ways to modify the syntax in a space- and comment-preserving way. ;)
That was developed for 2to3.
Huh. I did not know this - in lib2to3, the "asserts" fixer handles this sort of conversion, so you have the ability already .. so long as you don't use new Python 3.10+ syntax that lib2to3 doesn't handle. See https://docs.python.org/3.7/library/2to3.html#2to3fixer-asse... .
I wonder if serhiy-storchaka (who committed that What's New entry) knows about lib2to3's "asserts" fixer. And if that might be a relevant addition or change to the documentation.
Someone who cares about this might want to point it out.
Sad to see distutils going away. I have very old projects which use distutils for distribution. Wrote them in Python 2 days and they were ported easily to Python 3. But distutils going away is going to break them.
I know setuptools is more advanced and is the recommended choice, but distutils worked fine for me and my users for many years. It is going to be overhead for me to comb through all my projects and replace distutils with setuptools, then test them; testing packaging and distribution is quite a lot of work.
I am growing unhappy with Python due to these breakages. Is there some other programming language whose maintainers don't break the "user space" like Python has been doing time and again?
I've been pretty happy with Ruby. I don't ever recall changing my ruby scripts for a new release (whereas with Python it's been many times). Rails is a different story although rails is not ruby. Ruby is similar enough to python that most pythonistas can already read it, so learning effort is mainly just the API differences.
> Remove the filename attribute of gzip.GzipFile, deprecated since Python 2.6, use the name attribute instead.
2.6 wow, that's some compatibility right there. About time that got removed then! Either that or undeprecate it, if it's fine to use. Any decision at this point is a good one.
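For anyone hitting this, the migration is mechanical: GzipFile has exposed the same information via the name attribute all along. A quick sketch with a throwaway file:

```python
import gzip
import os
import tempfile

path = os.path.join(tempfile.mkdtemp(), "data.txt.gz")
with gzip.GzipFile(path, "wb") as f:
    f.write(b"hello")
    print(f.name)        # non-deprecated spelling; prints the path given above
    # f.filename         # deprecated since 2.6, removed in 3.12
```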
Is there a standard way a Python script can ask for an older version of the environment? If I could insert a few lines of code into the top of the WikidPad source, to specify an OLDER version of wxPython, for example... I'd be able to switch to Linux from Windows.
As it is now, there were breaking changes in wxPython (likely due to the culture of breaking working code extant in Python) which result in WikidPad being broken.
It seems to me that if the Python community continues in this direction, nothing will work more than 3 months after it's last github commit.
[edit/append] No, I'm not sure it's wxPython and not wxWindows that is the issue. I'll have to stuff my Linux boot SSD in, and then try to build WikidPad again to know for sure (it's been too long)
I didn't write WikidPad, it seems to have been last maintained about 2012, but I use it for my notes, etc. I really like it, but the breakage on the Linux side is a show stopper. It's really unfortunate that Linux doesn't have a stable API like Win32, and forces dependence on source code.
I'd consider installing an older version of Linux to force older python, wxPython, wxWindows, etc... and try to figure it from there... the last time I looked at it I got a wall of confusing errors, and couldn't patch it enough to get any functionality out of it.
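There's no standard in-script way to request an older environment; the usual workaround is a virtualenv with pinned dependency versions. What a script can do is detect the mismatch up front and point at a pin, instead of dying with a wall of confusing errors. A sketch (the version numbers and the wxPython pin are illustrative, not known-good values for WikidPad):

```python
import sys

TESTED_MAX = (3, 10)   # hypothetical: the last interpreter the app was tested on

def compat_message(version=None, tested_max=TESTED_MAX):
    """Return guidance when running on an untested interpreter."""
    if version is None:
        version = sys.version_info[:2]
    if version > tested_max:
        return ("untested interpreter: consider a venv with pinned deps, "
                "e.g. pip install 'wxPython==4.0.7'")   # illustrative pin
    return "interpreter within tested range"

print(compat_message())
```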
wxPython is a Python module. It isn't the same as Python itself. Don't blame the language for issues you're having with libraries written in it.
On top of that, wxPython is a Python module which interfaces with an external library (wxWindows). Are you sure your problems are even with the Python module, and not the result of breaking changes in wxWindows itself?
Speaking of deprecated items: there is a lot of talk about pyenv in this discussion, and some even suggest it to augment venv. But wasn't that deprecated since 3.6?
“Deprecated since version 3.6: pyvenv was the recommended tool for creating virtual environments for Python 3.3 and 3.4, and is deprecated in Python 3.6.”
Versioning is a nightmare, btw. I have just installed Conda, but sometimes python is not the same as python3: one points to 3.9 and the other to 3.10. And that's before you learn the differences between pip3, pip, and python -m pip (and not pip3…).
Using venv, somehow the Kivy script switched it out and fell back to Python 3.9. Where did that come from? Then it failed because there is no python on my macOS, only python3. I struggled the whole night trying to fix it with a bash alias, failed, and ended up using a symbolic link.
I'm just a “user”, guys. Please do not confuse me.
“ There should be one-- and preferably only one --obvious way to do it.” sigh.
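For what it's worth, the non-deprecated tool is the built-in venv module, and it sidesteps the python-vs-python3 ambiguity: whatever interpreter creates the environment becomes its bin/python. A sketch using the stdlib API directly (equivalent to running python3 -m venv .venv; assumes a Unix-style bin/ layout):

```python
import subprocess
import venv

# Create the environment with the currently running interpreter;
# its bin/python is then pinned to that exact version.
venv.create(".venv", clear=True)

# Inside the environment there is no ambiguity: this is the env's own python.
out = subprocess.run([".venv/bin/python", "--version"],
                     capture_output=True, text=True)
print(out.stdout.strip())
```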
This is enough for me to swear off python for any long-lived or production system. I am tired of this kind of endless incompatibility. It has fractured the ecosystem sufficiently that things break all the time when trying to use any non-trivial set of dependencies. Maybe the developers will stop this crap with Python 4. It’s the next Perl 6!
Speaking of standard lib, today I was looking to do a sliding window over an iterator and ended up on the pairwise function in itertools that was added in 3.10.
Seems like such a wasted opportunity, to add a sliding window for n=2, and then in the documentation add a recipe for a sliding window function for any n.
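For reference, the recipe in question is short; this is a sketch along the lines of the sliding_window recipe in the itertools docs:

```python
from collections import deque
from itertools import islice

def sliding_window(iterable, n):
    # Yield overlapping tuples of n consecutive items from the iterable.
    it = iter(iterable)
    window = deque(islice(it, n - 1), maxlen=n)
    for x in it:
        window.append(x)
        yield tuple(window)

print(list(sliding_window("abcde", 3)))
# [('a', 'b', 'c'), ('b', 'c', 'd'), ('c', 'd', 'e')]
```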
Their own decisions. They want to spread little breakages over many releases instead of collecting them into big breakages, due to the 2-to-3 fiasco. Feels like an over-correction, imho.
> Anyone care to comment on what they are using for receiving emails in Python?
In most cases, you use an existing MTA (like Postfix or whatnot) and set it up to deliver mail to a Python script. Or, even less directly, you use an IMAP library to access mail after it's delivered to a mailbox, or use a mail provider which can call a webhook over HTTP when an email is received.
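To illustrate the first option: Postfix (or any MTA) can be configured to pipe each incoming message to a script on stdin, and the stdlib's email package parses the raw bytes. A sketch of what such a delivery script does, with the raw message inlined here instead of read from stdin (addresses are made up):

```python
import email
from email.policy import default

# In a real delivery script this would be: raw = sys.stdin.buffer.read()
raw = (b"From: alice@example.com\r\n"
       b"To: bob@example.com\r\n"
       b"Subject: Hello\r\n"
       b"\r\n"
       b"Hi Bob!\r\n")

msg = email.message_from_bytes(raw, policy=default)
print(msg["From"], "/", msg["Subject"])   # alice@example.com / Hello
print(msg.get_content().strip())          # Hi Bob!
```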
Is there any model or rationale behind introducing breaking changes in a minor release? What are the version numbers good for if not to indicate how important the changes are?
The C stdlib is more tightly coupled to the OS than to any particular compiler, and it is not changed when you select a particular dialect (though the language standard does specify what the stdlib must support).
The title is referencing this line (not TFA's actual title):
> In the unittest module, a number of long deprecated methods and classes were removed. (They had been deprecated since Python 3.1 or 3.2).
I doubt your projects are using 3.1 or 3.2 (or older, except perhaps 2.7, but then you wouldn't care that 3.12 was removing something deprecated vs 3.11 which has it)?
I do not know what is wrong with Python, because I use it only occasionally, for short programs that do not use obscure features.
Nevertheless, there is no doubt that it is the worst software project that I have ever seen, during several decades, from the point of view of keeping compatibility between versions.
For other programs, I may happen to need to have installed 2 versions, or maybe 3 versions, at most, in order to be able to use other programs that are compatible only with certain versions.
On the other hand, for Python, during many, many years, I have been forced to keep around 7 or 8 versions installed at all times, to keep happy the many other programs that claim to be compatible only with certain Python versions (and not with any newer versions; some programs are even compatible only with a list of non-consecutive versions).
Building from source any program that depends on Python may frequently require various temporary configuration modifications, to ensure that the program is built only for the Python versions with which it is compatible.
I will say this, I love python, and I hate python the way only someone who loves it can. I think everything you said is accurate.
I love writing Python code. I love its standard library and many third party libraries. But I loathe Python’s dependency management, and Python version upgrades are sometimes a huge pain. There are tools to help with this (pyenv or I am fond of asdf for python versions; poetry or pipenv for isolating package dependencies), but they only partially solve the problems and introduce new ones.
> An example of a more recent deprecation is the 'distutils' module which was deprecated in 3.10.
distutils was functionally deprecated long before that. Even the Python 2.7 documentation (released in 2010!) recommended that users avoid distutils and use setuptools instead: https://docs.python.org/2.7/library/distutils.html
Deprecated does not mean removed. They are still there, working fine, with no big reason to change them. I know, because I just checked my own and my workplace's code, and there are several usages of deprecated code. And we are on 3.9/3.10.
So what? As long as it works, don't change it. These parts have been deprecated for up to nearly a decade (Python 3.2 was released February 20th, 2011), and they will keep working for some more years. Some don't even have a working replacement yet. Investing now in changing old code just to be compatible with a not-yet-released version is a bit pointless if you have more important work on the list.
Python 2 was supported for way longer than initially announced. Distutils as well. Maybe deprecation is not a strong enough signal, because many projects hesitate to cut loose the stuff they deprecated, fearing community backlash. Sometimes things have to be deprecated, but the community will never get around to adapting if things stick around forever.
Compared to that, Java still carries some deprecated items around. But with Java 9, deprecation for removal was introduced, which is a strong signal that stuff will be gone in two releases. Kubernetes has a strict deprecation policy as well, such that skipping more than two releases is asking for trouble.
The move to web based apps or connected in general was a genius move by the programming community.
Now there will be an endless churn to keep programs up to date for "security reasons" and that cost money.
We can even pull the plug on programs nowadays to the give the users the very best experience! Otherwise the lusers might have felt satisfied with what they have.
I recently upgraded a Python 2 + Django 1.x project with no tests to Python 3 and latest Django.
Using 2to3, the most trouble I had were db migrations, which I just squashed. The rest were ferreting out str() problems that 2to3 didn't find (eg. redis package happily taking strings but returning bytes by default), Django regexp url changes, and trivial stuff like that.
It took a couple of days work.
I also wanted to upgrade the frontend part (a webpack build of Vue). I stopped after a few hours of going nowhere and am still apprehensive about approaching that particular thing.
Or, the glass half full view of the world suggests that low usage high barrier to maintenance module deprecation increases overall long term improvements and compatibility at the cost of small, planned, notified, short term inconvenience.
I think you're missing the point. Nothing is wrong with deprecation, but it should be done properly. Ideally, breaking changes should correspond to a major version increase.
Several groups of people offered to maintain Python 2. They were told very clearly they could not do so "officially", and were even threatened with lawyers if their thing looked like it could be mistaken for "Python 2".
As you can see, it is still offered, and the PSF doesn't have a problem with that.
The problem with those other groups is that they didn't want to maintain Python 2. They wanted to take it and develop it, evolving the language separately from Python 3 - while also calling the result "Python 2.8" (and presumably later Python 2.9 etc). It stands to reason that people who own the brand don't want it to be associated with a third-party fork that's making its own major design decisions, no?
Do you have pointers to that? The one I saw was called “python 2.8” which is definitely a point of confusion for people who might think it was supported.
Yes, that’s the one. Note that they’re still there – the PSF just didn’t want them calling something which isn’t made by the Python developers Python 2.8. Once they adopted the Tauthon name they were fine.
Apart from requiring security fixes, they also sometimes block changes and newer features. At that point, the maintainers might have to choose between breaking the obsolete module or shelfing the feature/fix.
Deprecations are the way to indicate that support for the module will be eventually dropped. There is no reason to leave those modules there if they cannot be relied on.
“Deprecated” usually implies intent to remove. Regardless, even dealing with such rejections is a support burden. And security vulnerabilities will tend to be fixed, even in deprecated code.
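In Python's case, the grace-period signal is a runtime DeprecationWarning: the old name keeps working while pointing callers at the replacement. A minimal sketch of the usual pattern (the API names are made up):

```python
import warnings

def new_api():
    return 42

def old_api():
    # stacklevel=2 attributes the warning to the caller, not this wrapper.
    warnings.warn("old_api() is deprecated, use new_api() instead",
                  DeprecationWarning, stacklevel=2)
    return new_api()

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    print(old_api())                        # 42, plus a recorded warning
    print(caught[0].category.__name__)      # DeprecationWarning
```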