thank you for the history lesson
trio and anyio fixes that
One phrase not found in the article: colored functions.
This is a Linux post. It has nothing to do with Python.
Free-threaded Python came on the scene and packages slowly added support. So there is a will to gravitate toward and adopt what works, albeit gradually.
I prefer typing_extensions over typing and collections.abc.
With typing_extensions, new features are always backported. With stdlib typing features, you have to continuously upgrade Python, and whatever you upgrade to is already guaranteed to be very temporary. It's much easier to upgrade a package.
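A sketch of that backport-first import pattern, using TypedDict as an arbitrary example:

```python
# Backport-first import: typing_extensions (a plain package) always carries
# the newest typing features, so an old interpreter doesn't block adoption.
try:
    from typing_extensions import TypedDict  # pip-installable backport
except ImportError:
    from typing import TypedDict  # stdlib fallback (Python 3.8+)

class Point(TypedDict):
    x: int
    y: int

p: Point = {"x": 1, "y": 2}
print(p["x"] + p["y"])  # 3
```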
For the same reasoning, I would prefer Trio over asyncio.TaskGroup.
Which leads to the question: Trio vs asyncio.TaskGroup?
asyncio.TaskGroup is a Python 3.11 feature, with the context kwarg added in 3.13. The documentation is very terse and I'm unsure what guarantees it has, besides "strong". Missed opportunity: they could have used the adjective "Mickey Mouse". Both are essentially the same, useless.
Having to upgrade to 3.13 is what I call failure to backport, or simply failure, or that's what failure looks like.
Give a free pass to free threading, but everything else, no!
Having to upgrade Python to get access to sane structured concurrency is silly. I have the exact same complaints about package managers.
!r is a thing
!s is a thing
There is some syntax for formatting a float that will be completely forgotten and will have to be looked up.
There is nothing else worth knowing.
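The three things in question, in one place:

```python
value = "hi"
pi = 3.14159

print(f"{value!s}")  # !s applies str()  -> hi
print(f"{value!r}")  # !r applies repr() -> 'hi'
print(f"{pi:.2f}")   # fixed-point, two decimals -> 3.14
print(f"{pi:.2e}")   # scientific notation -> 3.14e+00
```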
Now let's moan and complain about something actually important, like repos with languishing PRs, like SQLModel.
Upvote for the sanity check.
As the OP mentioned, this is a proposed/draft feature that may or may not ever happen.
With these kinds of posts, we should start a betting pool: put money down on whether the feature sees the light of day within an agreed-upon fixed time frame.
Why the commercial license for pngquant? Maybe rewriting pngcrush's IP and slapping a commercial license on it is copyright infringement. This is my impression of Rust: take others' IP, rewrite it in Rust, and poof, copyright magically transferred. The C99 version, how much of that is from prior art?
Let's just ignore prior art and the associated license terms.
written by Kornel Lesiński
ImageOptim Ltd. registered in England and Wales under company number 10288649 whose registered office is at International House, 142 Cromwell Road, London, England, SW7 4EF
First commit Sep 17th, 2009
Copyright (C) 1998-2002, 2006-2016 Glenn Randers-Pehrson
glennrp at users.sf.net
Portions copyright (C) 2005 Greg Roelofs
i'm a fan of ladies with complete test coverage
but I'm ok with those who are fans of type inference.
More ladies for me
Oh btw there are three choices, not two.
pydantic underneath (pydantic-core) is written in Rust. fastapi is fast because of pydantic. fastapi is extremely popular because it's right there in the word: fast. No matter how bad the usability is, the word "fast" will always win.
If fastapi is fast, then whatever is not fastapi is slow. And there is no convincing anyone otherwise, so let's not even try.
Therefore let's do fast, because we already agreed slow would be bad.
Normal dataclasses are not fast and therefore bad. If they had a better marketing team, this would be a different conversation.
SQLModel combines pydantic and SQLAlchemy.
At first I fell in love with the SQLModel docs. Then I realized the eye-wateringly beautiful docs are missing vital details, such as how to override __tablename__.
cls.__name__.lower() is particularly nasty. SQLModel's __new__ implementation involves multiple metaclasses, so subclasses always inherit that worthless __tablename__ implementation. And SQLAlchemy applies three decorators, so figuring out the right witchcraft to create the descriptor is near impossible. pydantic doesn't support overriding __tablename__.
Then I came along.
After days of, let's be honest, hair loss and bouts of heavy drinking, I posted the answer here.
It required familiarity with pydantic, SQLAlchemy, and SQLModel.
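The default-__tablename__ behavior can be sketched in plain Python. This is an illustration of the descriptor pattern involved, not SQLModel's actual code:

```python
# Illustration only -- not SQLModel's actual implementation. A descriptor on
# the base class derives __tablename__ from the class name, which every
# subclass inherits unless it shadows the attribute.
class DefaultTablename:
    def __get__(self, obj, cls):
        return cls.__name__.lower()

class Base:
    __tablename__ = DefaultTablename()

class HeroTeam(Base):
    pass  # inherits the derived name

class Custom(Base):
    __tablename__ = "my_table"  # a plain class attribute shadows the descriptor

print(HeroTeam.__tablename__)  # heroteam
print(Custom.__tablename__)    # my_table
```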
There is an expression: Linux isn't free, it costs you your time. Which might be a counterargument against always using only what is built in.
I'm super guilty of reinventing the wheel. But writing overly verbose code isn't fun either; I never seem to get very far.
> people are forced to install dependencies
This ^^.
If possible, Python dependency management is a burden I would prefer to avoid. Until I can't; then be skilled at it!
disclosure: i use/wrote wreck for Python dependency management.
Compiled languages should really live within containers. At all costs, I would like to avoid time-consuming system updates! I can no longer install C programs because the OS partition ran out of disk space, whereas Python packages can be installed on data-storage partitions.
> for Python, I usually deliver the script as a single .py file

I'm sure you are already aware of this, so forgive me if this is just being Captain Obvious.
Even if the deliverable is a single .py file, there is support for specifying dependencies within a module-level comment block (I forget the PEP number).
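For what it's worth, that inline block looks roughly like this (field names from memory; verify against the PEP before relying on them):

```python
#!/usr/bin/env python3
# /// script
# requires-python = ">=3.9"
# dependencies = [
#     "requests",
# ]
# ///

# ...the rest of the single .py file deliverable follows; tools that
# understand the block create a venv with these dependencies before running.
```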
> I don't like that (unless it's a shell script, but that is by its nature dependency hell)

You and I could bond over a hatefest on shell scripts, but let's leave that outside the discussion scope.
As for your argument: as the complexity of a .py script grows, very quickly it comes to the point where the deliverable becomes a Python package. The exceptions are projects which are external-language, low-level, or simple. This .py-script approach does not scale and is exceedingly rare to encounter; it may be an indication of an old, dated, or unmaintained project.
From a random venv, installed scripts:
_black_version.py
appdirs.py
cfgv.py
distutils-precedence.pth
mccabe.py
mypy_extensions.py
nodeenv.py
packaging_legacy_version.py
pip_requirements_parser.py
py.py
pycodestyle.py
pyi.py
six.py
typing_extensions.py
What is the root basis of your external package reluctance? Please explain cuz that's really where the juicy story lies.
As technologists, things change and advance and we have to adapt (or not) with the times. Maintaining the universe by ourselves is impossible; instead, almost all of our tech choices come from what's available, and only if/when that is insufficient do we roll up our sleeves.
More and more packages are using click. So there is a good chance, if you look at your requirements .lock file, that it's already a transitive dependency of another dependency.
Or said another way: show me 5 popular packages that use argparse and not click, and dataclasses and not attrs.
why click is based on optparse and not argparse
Long help applies only to optional args. Short help applies only to subcommands. Special mention for how to document positional args: the docs explain the intentional lack of a help kwarg for positional args.

./thing.py -h
    Lists all the subcommands, with a one-line short description for each subcommand.
./thing.py subcommand --help
    Lists the detailed docs of one subcommand.
My opinion, having used both argparse and click: click is simpler, cleaner, and less time consuming.
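To make the -h vs subcommand --help split concrete, here is a minimal argparse sketch (the `thing` CLI and its `greet` subcommand are invented for the example):

```python
import argparse

parser = argparse.ArgumentParser(prog="thing")
subparsers = parser.add_subparsers(dest="command", required=True)

# `thing -h` lists subcommands with this one-line short description.
greet = subparsers.add_parser("greet", help="Greet someone")
greet.add_argument("name")  # positional arg
greet.add_argument("--shout", action="store_true",
                   help="Uppercase the greeting")  # long help, optional arg

# `thing greet --help` shows the detailed docs for just this subcommand.
args = parser.parse_args(["greet", "World", "--shout"])
msg = f"Hello, {args.name}!"
print(msg.upper() if args.shout else msg)  # HELLO, WORLD!
```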
> No endless scroll of algorithmic 'content'
That's not the case. According to the docs,
itter watch [mine|all|#chan|@user] would entail infinite scrolling of content.
itter timeline [mine|all|#chan|@user] [<page>] although there is pagination, the content list is potentially infinite
Admit it! There is no search algorithm for content filtering, or for finding contacts or blocked users.
Whatever the object is, there is no search algorithm to traverse it intelligently.
For example, say I'm super popular but with a tendency to ghost everyone, like a stereotypical LINE user faced with unpopular opinions or topics. So the list of unfollowed (aka blocked) users is approaching infinity.
A truly admirable dedication to being a really horrible human being.
At the local bar, my mates and I have a drinking game where they think up random search criteria to see the kinds of categories of people who have been blocked. A weak or nonexistent search algorithm would mean we're not likely to leave the bar on our feet.
Show me chicks with green hair who post about both climate doom and vaccines being great for children. Living in the USA, Canada, or New Zealand. With a cat avatar and either giant earrings or a nose piercing.
That should be simple enough.
yep look at that! i ghosted five green haired freaks. Now drink!
what's your secret name for the project?
the ludwicks of Void Linux ftw!
I'd actually like to do something else with my lifetime besides constantly being tossed around for no apparent benefit. I'm sure there is a good excuse; there always is.
I'd appreciate feedback once you've had the chance to evaluate wreck.
Feel free to open an issue, which is the best way to catch my attention.
In CHANGES.rst, there are lists of both feature requests and known issues.
wreck is a dependencies manager which is venv-aware, BUT it is not a venv manager, nor does it put dependencies into pyproject.toml. It sticks with good ol' requirements files.
It assumes that, for each package, it's normal to be working with several venvs.
It syncs the dependencies intended for the same venv.
req fix --venv-relpath='.venv'
req fix --venv-relpath='.doc/.venv'
Across many packages, unfortunately, you have to resort to manually syncing dependencies.
Lessons learned
I wrote wreck because all the other options combine requirements management with everything including the bathroom sink: build backends ... venv management ... everything goes into pyproject.toml. All these ideas just compound the learning curve.
Less is more, especially when it comes to learning curve.
> Three-argument pow() now tries calling __rpow__() if necessary. Previously it was only called in two-argument pow() and the binary power operator. (Contributed by Serhiy Storchaka in gh-130104.)
That's a nail, or wart, that has been sticking out since forever.
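A toy illustration of where __rpow__() gets consulted; the Percent class here is invented for the example:

```python
class Percent:
    """Toy numeric type showing when Python falls back to __rpow__()."""
    def __init__(self, v):
        self.v = v

    def __rpow__(self, base):
        # Called for `base ** self` after base's own __pow__ returns
        # NotImplemented for an unfamiliar right-hand operand.
        return base ** (self.v / 100)

print(2 ** Percent(100))  # 2.0 -- two-arg power has always tried __rpow__()
# Before the change quoted above, three-argument pow(2, Percent(100), 5)
# would never try __rpow__() and simply raised TypeError.
```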
ok fine, let's talk about this Linux distro
I don't want to be a package-manager database in my off hours. Why is having users manage every transitive dependency a good design?
I'm asking because I really don't understand the merits of adopting this heavy burden.