The Zen of Python is a collection of 19 “guiding principles” for writing computer programs that influence the design of the Python programming language.[1] Python code that aligns with these principles is often referred to as “Pythonic”…
- Beautiful is better than ugly.
- Explicit is better than implicit.
- Simple is better than complex.
- Complex is better than complicated.
- Flat is better than nested.
- Sparse is better than dense.
- Readability counts.
- Special cases aren’t special enough to break the rules.
- Although practicality beats purity.
- Errors should never pass silently.
- Unless explicitly silenced.
- In the face of ambiguity, refuse the temptation to guess.
- There should be one– and preferably only one –obvious way to do it.[c]
- Although that way may not be obvious at first unless you’re Dutch.
- Now is better than never.
- Although never is often better than right now.[d]
- If the implementation is hard to explain, it’s a bad idea.
- If the implementation is easy to explain, it may be a good idea.
- Namespaces are one honking great idea – let’s do more of those!
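The list above ships with Python itself as an Easter egg: importing the `this` module prints the Zen, and the module stores the text ROT13-encoded, so it can also be decoded by hand:

```python
# Importing the stdlib `this` module prints the Zen of Python to stdout.
import io
import contextlib

buf = io.StringIO()
with contextlib.redirect_stdout(buf):
    import this  # printing happens as a side effect of the first import

zen = buf.getvalue()
print(zen.splitlines()[0])  # The Zen of Python, by Tim Peters

# The module keeps the text ROT13-encoded in `this.s`, with the
# translation table in `this.d`; decoding recovers the same text.
decoded = "".join(this.d.get(c, c) for c in this.s)
assert "Beautiful is better than ugly." in decoded
```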
Tag: Programming
Constantly Wrong and Out of Your Depth – Psychology of Programming
“The thing that gets lost, and which I think is important to know, is that programming is never easy,” he says. “You’re never doing the same thing twice, because code is infinitely reproducible, so if you’ve already solved a problem and you encounter it again, you just use your old solution. This means that by definition you’re kind of always on this frontier where you’re out of your depth. And one of the things you have to learn is to accept that feeling—of being constantly wrong and not knowing.”
Which sounds like it could be a Buddhist precept. I’m thunderstruck.
“Well, constantly being wrong and out of your depth is not something people are used to accepting. But programmers have to,” he concludes.
Devil in the Stack: Searching for the Soul of the New Machine
Andrew Smith
User Names – Programming Snags with
“tonyHawk and the tale of Feature not a bug” – u/GoldenBaby2, r/ProgrammerHumor

Classy_Mouse
I know some people who only have a given name, no family name. So when they came over to Canada, they had a lot of issues with official forms. Some of them split their name into two names; some just repeated their given name twice.
Toloran
True story, I went to middle school with a kid whose entire name was ‘Rainbow’. I initially assumed his parents were hippies or something, but it turned out they were hippies and indecisive: They both had different last names, but couldn’t decide which to give him. So they just didn’t give him one.
EODdoUbleU
My first name is a single letter. The amount of shit I can’t do without creating some bastardization to fulfill the mUSt cONTaIn a miNImUM Of tWO ChAraCTeRS bullshit is annoying as fuck.
Airport kiosks are absolutely the fucking worst because their system won’t let me put my legal name in, but I have to use my legal name to pass security.
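The “minimum of two characters” complaint comes from validation rules like the following. This is a hypothetical sketch, invented for illustration (the rule and error messages are not from any real kiosk), showing how naive checks reject perfectly legal names:

```python
# Hypothetical name validator of the kind being complained about:
# it rejects legal single-letter first names and legal mononyms
# (people with no family name at all).
def naive_validate(first, last):
    """Return a list of (bogus) validation errors for a name."""
    errors = []
    if len(first) < 2:
        errors.append("first name must contain a minimum of two characters")
    if not last:
        errors.append("family name is required")
    return errors

print(naive_validate("A", "Nguyen"))         # rejects a legal one-letter name
print(naive_validate("Rainbow", ""))         # rejects a legal mononym
print(naive_validate("Margaret", "Simpson")) # passes
```

The underlying lesson is the well-known “falsehoods programmers believe about names”: length, structure, and uniqueness assumptions all break on real people.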
FormerGameDev
aye, like Metallica’s longest-lasting bass player, Roberto Agustín Miguel Santiago Samuel Trujillo Veracruz
thrye333
I like the point near the end about names being consistent across systems, because when I was getting ready to go apply to colleges, I found out that most of them had my last name misspelled. I have a common English first name as my last name. I have never seen it spelled how 75% of those colleges spelled it.
I have no idea how they got that spelling. I don’t even know how they had my info. But that’s college mailing lists, I guess.
Dalimyr
I used to work in a hospital, sharing an office with another team who told a story about how people testing in their system (in prod, because you’re lucky to have a proper test environment in the public sector) would use Simpsons characters for their tests. People who knew this got accustomed to filtering out “Bart Simpson”, “Lisa Simpson” and all that…until one day this instinctive behaviour impacted a patient (I can’t remember what happened – if it was an appointment being deleted or something like that) because their name was Margaret Simpson and someone had erroneously thought this was just test data.
I don’t think this incident actually stopped the person from continuing to create Simpsons test data, but yeah, that shit happens from time to time.
AaronTheElite007
I wonder if someone has the last name ‘null’?
IMightBeErnest
I heard a story about a guy who made his license plate “null” and ended up getting assigned all of the tickets where a license number wasn’t known (or bugged, or something). Point is, he got like a bajillion fines he had to contest.
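The license-plate story is a classic sentinel-value collision: a magic string (“NULL”) is used to mean “no data”, and then collides with real data that happens to spell the same thing. A hypothetical sketch of the failure mode:

```python
# Hypothetical sketch: unreadable plates get recorded as the literal
# string "NULL", which later matches a real vanity plate spelling NULL.
tickets = [
    {"plate": "NULL", "fine": 50},    # plate unreadable, clerk typed "NULL"
    {"plate": "ABC123", "fine": 25},
    {"plate": "NULL", "fine": 75},    # another unknown plate
]

def fines_for(plate):
    return sum(t["fine"] for t in tickets if t["plate"] == plate)

print(fines_for("ABC123"))  # 25 -- as expected
print(fines_for("NULL"))    # 125 -- the vanity-plate owner inherits
                            # every ticket with an unknown plate
```

The fix is to represent missing data with a real missing-value type (`None` in Python, `NULL` in SQL) rather than a string a user could legitimately supply.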
Lambda Calculus – Brief Overview of
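The core idea fits in a few lines: Church numerals encode natural numbers as pure functions, and arithmetic becomes function composition. A Python illustration (mine, not from any particular overview):

```python
# Church numerals: the untyped lambda calculus encodes the number n
# as "apply a function f to an argument x, n times".
zero = lambda f: lambda x: x
succ = lambda n: lambda f: lambda x: f(n(f)(x))
add  = lambda m: lambda n: lambda f: lambda x: m(f)(n(f)(x))
mul  = lambda m: lambda n: lambda f: m(n(f))

def to_int(n):
    """Convert a Church numeral back to a Python int for inspection."""
    return n(lambda x: x + 1)(0)

two = succ(succ(zero))
three = succ(two)
print(to_int(add(two)(three)))  # 5
print(to_int(mul(two)(three)))  # 6
```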
Old Code – Load Bearing Bugs
The World Depends on 60-Year-Old Code No One Knows Anymore
u/debordian, r/programming
Ancillas
Had to learn COBOL and JCL at my first job. It wasn’t tremendously hard, but there was a huge volume of programs to understand.
At one point I was maintaining code written in 1969, where there was a bug that no one was supposed to fix because there were 45 years’ worth of programs that assumed that bug was present.
midri
Load bearing bug
ejfrodo
I spent a while building a code analysis tool for an ancient proprietary language that was basically a superset of Pascal. A multi-billion-dollar company was built on this. Nobody at the company fully knew how it worked because they’d all retired years ago, so I just had to be a code archeologist and hunt through the ruins. That was an interesting project.
giantsparklerobot
Part of the problem with COBOL is that it was meant for non-programmers to be able to encode and automate literal business logic. Maintaining or replacing a COBOL system isn’t just about the code: the under-specified business process needs to be reimplemented with 50-60 years’ worth of special exemptions and, in many cases, load-bearing bugs.
You can learn COBOL as easily as any other language. It’s much harder to bring in someone that understands the business process that COBOL automated.
RearExitOnly
Finally, someone who gets it. I was well paid not because I was the world’s best COBOL programmer, but because I was an expert on grain contracts. In the Midwest I was never out of work and usually made more than double what PC guys were making back in the ’80s.
Maleficent_Mouse_930
I am currently leading a squad developing a replacement for a core banking program originally written in COBOL, and the complexity is 100% in the actual operations. The code ain’t too tough, but it’s been 18 months so far of reverse engineering back and forth with the finance and compliance guys, trying to nail down what’s needed. Every time we find a new intricacy and tell them, they’re like, “Oh yeah, didn’t we tell you that?”
Morons. They’ve been relying on this software so long they don’t even know their own jobs any more.
Overuse of Inheritance – Object Oriented Programming
Many of these books claim that by creating these abstractions (in this example, Person) you can “re-use” them in other projects. However, in 15 years of professional experience, I have hardly ever seen such abstractions reused across projects in any useful manner.
At one point, I started creating a class library of these abstractions that I could share across projects. After a little while, that class library ended up being bloated with lots of unrelated abstractions and versioning it across different projects ended up being a nightmare. If you have done this before, you may relate to this story.
Once I was engaged as a consultant for a greenfield project, and in the introductory session, the data architect of the project walked me through tons of UML class diagrams (more than 50 pages). Every class was inheriting from another class and eventually, they all led to a class that was called “Thing”! No joke!
What textbooks tell you about inheritance in OOP is wrong
Mosh Hamedani
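The “everything eventually inherits from Thing” style described above can be contrasted with a flatter, composition-based design. A sketch, with all class names invented for illustration (not taken from the consultant’s actual diagrams):

```python
# Deep hierarchy: every concept funnels through abstract ancestors.
class Thing: ...
class LivingThing(Thing): ...
class Person(LivingThing):
    def __init__(self, name):
        self.name = name
class Employee(Person):
    def __init__(self, name, payroll_id):
        super().__init__(name)
        self.payroll_id = payroll_id

# Flatter alternative: model only what this project needs, and use
# composition when two concerns must be combined.
from dataclasses import dataclass

@dataclass
class PayrollRecord:
    payroll_id: str

@dataclass
class StaffMember:
    name: str
    payroll: PayrollRecord

e = StaffMember("Ada", PayrollRecord("E-42"))
print(e.payroll.payroll_id)  # E-42
```

The flat version gives up nothing in this example, and there is no shared base class to version across projects.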
Developers Discuss Pros and Cons of Programming Languages – Slashdot
Linus Torvalds Says Rust Closer for Linux Kernel Development, Calls C++ ‘A Crap Language’
Asked about a suggestion by a commenter on the Linux Weekly News website, who said, during a discussion on the Google post, “The solution here is simple: just use C++ instead of Rust”, Torvalds could not restrain himself from chortling. “LOL,” was his response. “C++ solves _none_ of the C issues, and only makes things worse. It really is a crap language.
“For people who don’t like C, go to a language that actually offers you something worthwhile. Like languages with memory safety and [which] can avoid some of the dangers of C, or languages that have internal GC [garbage collection] support and make memory management easier. C++ solves all the wrong problems, and anybody who says ‘rewrite the kernel in C++’ is too ignorant to even know that.”
——
“There is a class of programmer that “clicks” with the OOP concepts of inversion of control, dependency injection, and always coding to an interface. This approach feels so correct to them that they become almost religious in their devotion to it. They believe it is the one-size-fits-all solution to every programming problem, and consider anyone who disagrees with them to simply be less intelligent than themselves (because no rational person could possibly disagree).
I have worked with such people. They quadruple the codebase, significantly increase development time, and introduce more bugs by doing so, all the while insisting that this will make the code easier to maintain in the long run. But it actually doesn’t; it just means that there is that much more code that must be refactored as new features come along.
They point at the ease with which layers can now be swapped out as a total win. But in practice, 90% of the interfaces have only one implementation. There is nothing to switch to, because there just isn’t any need. And future development isn’t simplified by finding just the right layer to adjust; the changes are almost always cross-cutting, so you have to refactor all those interfaces and layers top-to-bottom, each time, to get things working.
It just doesn’t work out in practice the way they insist it will. At least, not often.”
——
“Just use JavaScript, make the kernel run off node.js.”
GUID – Explained Clearly
There are multiple GUID generation algorithms, but I’ll pick one of them for concreteness, specifically the version described in this Internet draft.
The first 60 bits of the GUID encode a timestamp, the precise format of which is not important.
The next four bits are always 0001, which identify that this GUID was generated by “algorithm 1”. The version field is necessary to ensure that two GUID generation algorithms do not accidentally generate the same GUID. The algorithms are designed so that a particular algorithm doesn’t generate the same GUID twice, but without a version field, there would be no way to ensure that some other algorithm wouldn’t generate the same GUID by some systematic collision.
The next 14 bits are “emergency uniquifier bits”; we’ll look at them later, because they are the ones that fine tune the overall algorithm.
The next two bits are reserved and fixed at 01.
The last 48 bits are the unique address of the computer’s network card. If the computer does not have a network card, set the top bit and use a random number generator for the other 47. No valid network card will have the top bit set in its address, so there is no possibility that a GUID generated from a computer without a network card will accidentally collide with a GUID generated from a computer with a network card.
Once you take it apart, the bits of the GUID break down like this:
- 60 bits of timestamp,
- 48 bits of computer identifier,
- 14 bits of uniquifier, and
- six bits are fixed,
for a total of 128 bits.
The goal of this algorithm is to use the combination of time and location (“space-time coordinates” for the relativity geeks out there) as the uniqueness key. However, timekeeping is not perfect, so there’s a possibility that, for example, two GUIDs are generated in rapid succession from the same machine, so close to each other in time that the timestamp would be the same. That’s where the uniquifier comes in. When time appears to have stood still (if two requests for a GUID are made in rapid succession) or gone backward (if the system clock is set to a new time earlier than what it was), the uniquifier is incremented so that GUIDs generated from the “second time it was five o’clock” don’t collide with those generated “the first time it was five o’clock”.
Raymond Chen, devblogs.microsoft.com
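The layout described above is the time-based (“version 1”) UUID from RFC 4122. Python’s standard-library `uuid` module generates these and exposes each field, so the 60/48/14/6 breakdown can be checked directly:

```python
# Dissecting a time-based ("version 1") UUID with the stdlib.
import uuid

u = uuid.uuid1()
print(u)             # e.g. 1b4e28ba-2fa1-11d2-883f-0016d3cca427
print(u.version)     # 1 -- the four version bits described above
print(hex(u.time))   # the 60-bit timestamp
print(u.clock_seq)   # the 14-bit "emergency uniquifier"
print(u.node)        # the 48-bit node: the network card's address, or a
                     # random number with the multicast bit set if absent

# Field widths match the 60 + 48 + 14 + 6 = 128-bit breakdown.
assert u.time < 2**60 and u.node < 2**48 and u.clock_seq < 2**14
```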
Practice of Programming, From Preface
Have you ever…
- Wasted a lot of time coding the wrong algorithm?
- Used a data structure that was much too complicated?
- Tested a program but missed an obvious problem?
- Spent a day looking for a bug you should have found in five minutes?
- Needed to make a program run three times faster and use less memory?
- Struggled to move a program from a workstation to a PC or vice versa?
- Tried to make a modest change in someone else’s program?
- Rewritten a program because you couldn’t understand it?
Was it fun?
These things happen to programmers all the time. But dealing with such problems is often harder than it should be because topics like testing, debugging, portability, performance, design alternatives, and style—the practice of programming—are not usually the focus of computer science or programming courses. Most programmers learn them haphazardly as their experience grows, and a few never learn them at all.
Kernighan, Brian W.. The Practice of Programming
The Six Levels of Debugging
Getting Started with Python and VS Code
I thought this was a pretty good intro to running Python in VS Code.
Note you want to have Python and VS Code installed. It goes from there.
Is a beanbag a chair? Object Oriented Programming Dilemma.
It is hard to talk about a “chair” if nobody agrees on what a chair is. There is enough of a common example base in OO (the shape, animal, and device-driver examples) that one can start, but beyond that the nature of OO diverges from person to person.
I’ll take that challenge. Find a definition of chair. For any definition of finite length, there is either an exception to the definition or a thing that is a chair that isn’t covered by the definition. And yet, we can still talk about chairs. [A lot of this is because OO is a broad church, embracing everyone from the prototype-based (Self, Io, JavaScript) to the class-based (Java, Smalltalk, etc.) to those who have built OO systems on top of other paradigms (CLOS, OCaml, various Scheme dialects, Python, Perl). Each of these has various flavors of usage as well, so talking about OO without qualifying it usually becomes a meaningless debate about whose definition we shall use.
We can only talk about chairs if we first state that we’re only interested in wooden four-legged chairs.]
I suppose we have beanbag chairs that are borderline “mini-couches”. But, this gets back to the need for a working classification system for OO. I don’t know if “modeling” can be separated from language or not.
Five Open-Source Projects AI Enthusiasts Might Want to Know About
TensorFlow
The Google Brain team created TensorFlow. Its underlying software powers some of the technologies that Google uses today. It translates languages, improves search engine results, recognizes pictures in Google Photos, and understands spoken words, making its machine learning (ML) capabilities genuinely awe-inspiring. To the surprise of the tech community, Google open-sourced TensorFlow, making it available to everyone. Developers can create ML models, classes for these models, and write imperative forward passes with it, among other things. TensorFlow uses Python, C++, and CUDA.
Brittany Day, linuxsecurity.com
5 Chrome DevTools Utilities You Need to Know
3. monitor / unmonitor
If we want to track when a method is called on our page we can use the monitor() function…
Alex Ritzcovan,
https://medium.com/dailyjs/5-chrome-devtools-utilities-you-need-to-know-5bfe58c75773
the link was via npm mailing list:
We came across this handy guide from Alex Ritzcovan for the Chrome users out there, including some lesser-known Chrome DevTools utilities you might not be aware of. Here are 5 of their favorite tools provided by the DevTools team.
go to https://www.npmjs.com/ to sign up or browse.
Should Coal Miners Learn To Code? Slashdot discusses
During a campaign event on Monday, U.S. presidential candidate Joe Biden “suggested coal miners could simply learn to code to transition to ‘jobs of the future,'” reports Newsweek:
“Anybody who can go down 300 to 3,000 feet in a mine, sure in hell can learn to program as well, but we don’t think of it that way,” he said… “Anybody who can throw coal into a furnace can learn how to program for God’s sake…”
Many Twitter users criticized Biden’s comments as reductive. “Telling people to find other work without a firm plan to help them succeed will never be popular,” communications professional Frank Luntz wrote… Congressional candidate Brianna Wu tweeted that she was “glad to see the recognition that you don’t need to be in your 20s to do this as a profession,” but also called Biden’s suggestion “tone-deaf and unhelpful.”
Long-time Slashdot reader theodp notes the response this speech got from New York magazine’s Sarah Jones: “Please Stop Telling Miners To Learn To Code.” And in comments on the original submission, at least two Slashdot readers seemed to agree. “Not everyone can code and certainly not every coal miner or coal worker,” wrote Slashdot reader I75BJC. “Vastly different skills.”
Slashdot reader Iwastheone even shared a Fox News article in which rival presidential candidate Andrew Yang argued “Maybe Americans don’t all want to learn how to code… Let them do the kind of work they actually want to do, instead of saying to a group of people that you all need to become coders.”
But is there something elitist in thinking that coal miners couldn’t learn to do what coders learned to do? It seems like an interesting question for discussion.