Many of these books claim that by creating these abstractions, in this example Person, you can “re-use” them in other projects. However, in 15 years of professional experience, I have seen few (if any) examples of such abstractions being reused across projects in a useful manner.
At one point, I started creating a class library of these abstractions that I could share across projects. After a little while, that class library ended up being bloated with lots of unrelated abstractions and versioning it across different projects ended up being a nightmare. If you have done this before, you may relate to this story.
Once I was engaged as a consultant for a greenfield project, and in the introductory session, the data architect of the project walked me through tons of UML class diagrams (more than 50 pages). Every class was inheriting from another class and eventually, they all led to a class that was called “Thing”! No joke!
Linus Torvalds Says Rust Closer for Linux Kernel Development, Calls C++ ‘A Crap Language’
Asked about a suggestion by a commenter on the Linux Weekly News website, who said, during a discussion on the Google post, “The solution here is simple: just use C++ instead of Rust”, Torvalds could not restrain himself from chortling. “LOL,” was his response. “C++ solves _none_ of the C issues, and only makes things worse. It really is a crap language.
“For people who don’t like C, go to a language that actually offers you something worthwhile. Like languages with memory safety and [which] can avoid some of the dangers of C, or languages that have internal GC [garbage collection] support and make memory management easier. C++ solves all the wrong problems, and anybody who says ‘rewrite the kernel in C++’ is too ignorant to even know that.”
“There is a class of programmer that “clicks” with the OOP concepts of inversion of control, dependency injection, and always coding to an interface. This approach feels so correct to them that they become almost religious in their devotion to it. They believe it is the one-size-fits-all solution to every programming problem, and consider anyone who disagrees with them to simply be less intelligent than themselves (because no rational person could possibly disagree).
I have worked with such people. They quadruple the codebase, significantly increase development time, and introduce more bugs by doing so, all the while insisting that this will make the code easier to maintain in the long run. But it actually doesn’t; it just means that there is that much more code that must be refactored as new features come along.
They point at the ease with which layers can now be swapped-out as a total win. But in practice 90% of the interfaces have only one implementation. There is nothing to switch anything to because there just isn’t any need. And future development isn’t simplified by finding just the right layer to adjust; the changes are almost always cross-cutting so you have to refactor all those interfaces and layers top-to-bottom, each time, to get things working.
It just doesn’t work out in practice the way they insist it will. At least, not often.”
There are multiple GUID generation algorithms, but I’ll pick one of them for concreteness, specifically the version described in this Internet draft.
The first 60 bits of the GUID encode a timestamp, the precise format of which is not important.
The next four bits are always 0001, which identify that this GUID was generated by “algorithm 1”. The version field is necessary to ensure that two GUID generation algorithms do not accidentally generate the same GUID. The algorithms are designed so that a particular algorithm doesn’t generate the same GUID twice, but without a version field, there would be no way to ensure that some other algorithm wouldn’t generate the same GUID by some systematic collision.
The next 14 bits are “emergency uniquifier bits”; we’ll look at them later, because they are the ones that fine tune the overall algorithm.
The next two bits are reserved and fixed at 01.
The last 48 bits are the unique address of the computer’s network card. If the computer does not have a network card, set the top bit and use a random number generator for the other 47. No valid network card will have the top bit set in its address, so there is no possibility that a GUID generated from a computer without a network card will accidentally collide with a GUID generated from a computer with a network card.
Once you take it apart, the bits of the GUID break down like this:
- 60 bits of timestamp,
- 48 bits of computer identifier,
- 14 bits of uniquifier, and
- six bits are fixed,
for a total of 128 bits.
The goal of this algorithm is to use the combination of time and location (“space-time coordinates” for the relativity geeks out there) as the uniqueness key. However, timekeeping is not perfect, so there’s a possibility that, for example, two GUIDs are generated in rapid succession from the same machine, so close to each other in time that the timestamp would be the same. That’s where the uniquifier comes in. When time appears to have stood still (if two requests for a GUID are made in rapid succession) or gone backward (if the system clock is set to a new time earlier than what it was), the uniquifier is incremented so that GUIDs generated from the “second time it was five o’clock” don’t collide with those generated “the first time it was five o’clock”.
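Python’s standard `uuid` module implements this same version-1 (time-based) layout, so the breakdown above can be checked directly. A minimal sketch; the accessor names (`time`, `clock_seq`, `node`) are the `uuid` module’s terminology for the timestamp, uniquifier, and computer-identifier fields described above:

```python
import uuid

def decode_v1(guid_str):
    """Take apart a version-1 GUID into its space-time components."""
    u = uuid.UUID(guid_str)
    assert u.version == 1, "not a version-1 (time-based) GUID"
    timestamp = u.time        # 60-bit timestamp (100-ns ticks since 1582-10-15)
    uniquifier = u.clock_seq  # 14-bit "emergency uniquifier" (clock sequence)
    node = u.node             # 48-bit network-card address, or random value
    random_node = bool(node >> 47)  # top bit set => no network card, random node
    return timestamp, uniquifier, node, random_node

# Generate a fresh v1 GUID on this machine and take it apart:
g = uuid.uuid1()
ts, uniq, node, random_node = decode_v1(str(g))
print(f"timestamp={ts}, uniquifier={uniq}, node={node:012x}, random={random_node}")
```

The field widths line up with the 60 + 14 + 48 + 6 = 128-bit breakdown in the list above; the six fixed bits (version and variant) are what the `assert` on `u.version` is checking part of.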
Raymond Chen, devblogs.microsoft.com
Have you ever…
- Wasted a lot of time coding the wrong algorithm?
- Used a data structure that was much too complicated?
- Tested a program but missed an obvious problem?
- Spent a day looking for a bug you should have found in five minutes?
- Needed to make a program run three times faster and use less memory?
- Struggled to move a program from a workstation to a PC or vice versa?
- Tried to make a modest change in someone else’s program?
- Rewritten a program because you couldn’t understand it?
Was it fun?
These things happen to programmers all the time. But dealing with such problems is often harder than it should be because topics like testing, debugging, portability, performance, design alternatives, and style—the practice of programming—are not usually the focus of computer science or programming courses. Most programmers learn them haphazardly as their experience grows, and a few never learn them at all.
Kernighan, Brian W., The Practice of Programming
I thought this was a pretty good intro to running Python and VS Code.
Note that you’ll want to have Python and VS Code installed. It goes from there.
It is hard to talk about a “chair” if nobody agrees on what a chair is. There is enough of a common example base in OO (the shape, animal, and device-driver examples) that one can start, but beyond that the nature of OO diverges from person to person.
I’ll take that challenge. Find a definition of chair. For any such definition of finite length, there is either an exception to the definition or a thing that is a chair that isn’t covered by the definition. And yet, we can still talk about chairs.
We can only talk about chairs if we first state that we’re only interested in wooden 4-legged chairs.
I suppose we have beanbag chairs that are borderline “mini-couches”. But, this gets back to the need for a working classification system for OO. I don’t know if “modeling” can be separated from language or not.
The Google Brain team created TensorFlow. Its underlying software powers some of the technologies that Google uses today. It translates languages, improves search engine results, recognizes pictures in Google Photos, and understands spoken words, making its machine learning (ML) capabilities genuinely awe-inspiring.
To the surprise of the tech community, Google open-sourced TensorFlow, making it available to everyone. Developers can use it to create ML models and classes for those models, and to write imperative forward passes, among other things. TensorFlow uses Python, C++, and CUDA.
Brittany Day, linuxsecurity.com
3. monitor / unmonitor
If we want to track when a method is called on our page we can use the monitor() function…
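`monitor()` and `unmonitor()` are Chrome DevTools Console Utilities; they exist only in the DevTools console, not in standard JavaScript. As a rough sketch of the behavior (not Chrome’s actual implementation), a method can be wrapped so each call is logged before running; `makeMonitored` and the `page` object below are hypothetical names for illustration:

```javascript
// Wrap obj[name] so every call is logged, mimicking DevTools' monitor().
// Returns a function that restores the original, mimicking unmonitor().
function makeMonitored(obj, name) {
  const original = obj[name];
  obj[name] = function (...args) {
    console.log(`function ${name} called with arguments: ${args.join(", ")}`);
    return original.apply(this, args);
  };
  return () => { obj[name] = original; };
}

// Usage: monitor a method on a hypothetical page object.
const page = { scrollTo(x, y) { return `${x},${y}`; } };
const unmonitor = makeMonitored(page, "scrollTo");
page.scrollTo(0, 100);   // logs the call, then runs the original
unmonitor();             // restore the unwrapped method
```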
The link was via the npm mailing list:
We came across this handy guide from Alex Ritzcovan for the Chrome users out there, including some lesser-known Chrome DevTools utilities you might not be aware of. Here are 5 of their favorite tools provided by the DevTools team.
Go to https://www.npmjs.com/ to sign up or browse.
During a campaign event on Monday, U.S. presidential candidate Joe Biden “suggested coal miners could simply learn to code to transition to ‘jobs of the future,'” reports Newsweek:
“Anybody who can go down 300 to 3,000 feet in a mine, sure in hell can learn to program as well, but we don’t think of it that way,” he said… “Anybody who can throw coal into a furnace can learn how to program for God’s sake…”
Many Twitter users criticized Biden’s comments as reductive. “Telling people to find other work without a firm plan to help them succeed will never be popular,” communications professional Frank Lutz wrote… Congressional candidate Brianna Wu tweeted that she was “glad to see the recognition that you don’t need to be in your 20s to do this as a profession,” but also called Biden’s suggestion “tone-deaf and unhelpful.”
Long-time Slashdot reader theodp notes the response this speech got from New York magazine’s Sarah Jones: “Please Stop Telling Miners To Learn To Code.” And in comments on the original submission, at least two Slashdot readers seemed to agree. “Not everyone can code and certainly not every coal miner or coal worker,” wrote Slashdot reader I75BJC. “Vastly different skills.”
Slashdot reader Iwastheone even shared a Fox News article in which rival presidential candidate Andrew Yang argued “Maybe Americans don’t all want to learn how to code… Let them do the kind of work they actually want to do, instead of saying to a group of people that you all need to become coders.”
But is there something elitist in thinking that coal miners couldn’t learn to do what coders learned to do? It seems like an interesting question for discussion