Category: Tech / Science

Clear Introduction to Modern JavaScript – Node, Webpack, Babel, Grunt

https://peterxjang.com/blog/modern-javascript-explained-for-dinosaurs.html

So this is modern JavaScript in a nutshell. We went from plain HTML and JS to using a package manager to automatically download 3rd party packages, a module bundler to create a single script file, a transpiler to use future JavaScript features, and a task runner to automate different parts of the build process. Definitely a lot of moving pieces here, especially for beginners. Web development used to be a great entry point for people new to programming precisely because it was so easy to get up and running; nowadays it can be quite daunting, especially because the various tools tend to change rapidly.

Why is C Preferred for Some Scenarios?

What are common uses of C in the real world outside of embedded and OS dev? from r/C_Programming

tim36272
Consider expanding what you normally think of as “embedded”: I develop embedded C applications but our box has the same processing power as a gaming computer.

Most safety-critical applications are written in C or Ada.

andai
Most safety-critical applications are written in C

This puzzles me — wasn’t safety one of the main reasons for developing alternatives to C and C++?

tim36272
You’re thinking of things like type safety, garbage collection, etc.

I’m talking about safety in terms of people dying. Things like garbage collection are the opposite of life safety: what if your airplane decided it needed to free up memory ten seconds from touchdown so it ran the garbage collector? What if running the garbage collector caused a valve to respond 0.1 seconds late to a command, which caused a chain reaction resulting in a hydraulic line bursting and losing control of the rudder?

C can be safe because it does exactly what the programmer tells it to do, nothing more and nothing less. There’s no magic going on behind the scenes which could have complex interactions with other behind the scenes magic.

A common example is C++’s std::vector. This container expands as needed to accommodate as many elements as you need. But you have a limited amount of memory on the system, so you need to do static analysis to determine the maximum size of that vector. And you need to be sure that you have enough memory for that plus everything else in your system.

Well, now you’ve eliminated a lot of the convenience of using std::vector: you might as well just allocate that max size to it and avoid all the overhead std::vector imposes by growing in size.

The other main advantage of std::vector is templates. If you were to use a template in safety-critical code you’d need to prove that the code generated by the compiler is correct for every template instantiation. Well, now you’re diving down into all this auto-generated machine code: it would be easier to just write that code yourself and avoid the complexity introduced by the compiler’s template generator.

So, if we’ve eliminated all the usefulness of std::vector, why use it at all?

Repeat that process for most features in most languages and voila! You’re back at C 🙂
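For illustration, a minimal sketch of the static-sizing argument above (the kMaxReadings bound and ReadingBuffer type are invented for this example, not from the thread): once static analysis has given you a worst-case size, you allocate it up front and the growth machinery of std::vector has nothing left to do.

#include <array>
#include <cstddef>

// Hypothetical worst-case bound; in a real system this number would
// come from the static analysis described above.
constexpr std::size_t kMaxReadings = 256;

// Instead of a std::vector that grows (and allocates) at unpredictable
// moments, allocate the worst case up front and track how much is used.
struct ReadingBuffer {
    std::array<int, kMaxReadings> data{};
    std::size_t count = 0;

    // Refuses instead of allocating when full, so the failure mode is
    // explicit and testable rather than a runtime allocation surprise.
    bool push(int value) {
        if (count == kMaxReadings) {
            return false;
        }
        data[count++] = value;
        return true;
    }
};

Written in C this would be a plain array and a count, which is the commenter’s point: strip out the growth and the templates, and std::vector no longer buys you anything.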

Computer Snobs Circa 2006

Linux Snobs, The Real Barriers to Entry

“It’s not Windows. It’s not distro wars. Sometimes it’s just the arrogant attitude that keeps people from switching from Windows. As I spoke to newbies, one Windows user who wanted to learn about Linux shared the encouraging and constructive note (not) he received from one of the project members. The responding note read: ‘Hi jackass, RTFM and stop wasting our time trying to help you children learn.’”

gEvil
Well duh! Of course it’s the arrogant users that are keeping people from trying Linux. That’s precisely the reason why I use a Mac.

elrous0
I managed to escape from that cult, and you can too brother!

Meet me by the fence tonight at 1am. I’ll have a van waiting. We can take you to a place where Father Steve will never find you. There is another life out there for you, trust me!

identity0
Are you kidding me? The Mac community is composed of 30% latte-sipping wannabe ‘artists’, 50% trendsters with too much money, 25% hippies, 4% Hollywood actors, and 1% Steve Jobs. And Steve is the least arrogant one of the bunch.

That’s why I Switched(tm) to OpenBSD, the least arrogant OS community!

I can go up to the head development guy, Theo, and he answers all my questions!! Usually the answer is how evil George Bush and Richard Stallman are, and how stupid I am for being a stupid American that supports stupid people and asks stupid questions because I am stupid. I don’t know how that solves my problems, but at least he answers!!!!

GrAfFiT
One huge difference is that the Microsoft tech support guys are paid to listen to your stupidities. You are a lot more patient and understanding when you’re paid.

Developers Discuss Pros and Cons of Programming Languages – Slashdot

Linus Torvalds Says Rust Closer for Linux Kernel Development, Calls C++ ‘A Crap Language’

Asked about a suggestion by a commenter on the Linux Weekly News website, who said, during a discussion on the Google post, “The solution here is simple: just use C++ instead of Rust”, Torvalds could not restrain himself from chortling. “LOL,” was his response. “C++ solves _none_ of the C issues, and only makes things worse. It really is a crap language.

“For people who don’t like C, go to a language that actually offers you something worthwhile. Like languages with memory safety and [which] can avoid some of the dangers of C, or languages that have internal GC [garbage collection] support and make memory management easier. C++ solves all the wrong problems, and anybody who says ‘rewrite the kernel in C++’ is too ignorant to even know that.”

——
“There is a class of programmer that “clicks” with the OOP concepts of inversion of control, dependency injection, and always coding to an interface. This approach feels so correct to them that they become almost religious in their devotion to it. They believe it is the one-size-fits-all solution to every programming problem, and consider anyone who disagrees with them to simply be less intelligent than themselves (because no rational person could possibly disagree).

I have worked with such people. They quadruple the codebase, significantly increase development time, and introduce more bugs by doing so, all the while insisting that this will make the code easier to maintain in the long run. But it actually doesn’t; it just means that there is that much more code that must be refactored as new features come along.

They point at the ease with which layers can now be swapped out as a total win. But in practice 90% of the interfaces have only one implementation. There is nothing to switch to because there just isn’t any need. And future development isn’t simplified by finding just the right layer to adjust; the changes are almost always cross-cutting, so you have to refactor all those interfaces and layers top to bottom, each time, to get things working.

It just doesn’t work out in practice the way they insist it will. At least, not often.”
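As a concrete illustration of the boilerplate being described (the names here are hypothetical, not from the post), this is the interface-plus-injection pattern with its typical single implementation, sketched in C++:

#include <memory>
#include <string>

// The interface, introduced "just in case" a second implementation
// ever appears.
class OrderRepository {
public:
    virtual ~OrderRepository() = default;
    virtual std::string find(int id) = 0;
};

// In practice, the only implementation there will ever be.
class SqlOrderRepository : public OrderRepository {
public:
    std::string find(int id) override {
        return "order-" + std::to_string(id);
    }
};

// The consumer codes to the interface and has its dependency injected.
class OrderService {
public:
    explicit OrderService(std::unique_ptr<OrderRepository> repo)
        : repo_(std::move(repo)) {}
    std::string describe(int id) { return repo_->find(id); }
private:
    std::unique_ptr<OrderRepository> repo_;
};

int main() {
    // The wiring site: one more place to touch whenever the interface
    // changes, and cross-cutting changes always change the interface.
    OrderService service(std::make_unique<SqlOrderRepository>());
    return service.describe(42).empty() ? 1 : 0;
}

One concrete class has become three declarations plus a wiring site; whether that is prudent insurance or dead weight is exactly the dispute in the quote above.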

——
“Just use JavaScript, make the kernel run off node.js.”

Slashdot

Software Development Failure Rates

In 2008, IBM reported that 60% of IT projects fail. In 2009, ZDNet reported that 68% of software projects fail.

By 2018, Information Age reported that number had worsened to 71% of software projects being considered failures. Deloitte characterized our failure rate as “appalling.” It warns that 75% of Enterprise Resource Planning projects fail, and Smart Insights reveals that 84% of digital transformation projects fail.

In 2017 Tech Republic reported that big data projects fail 85% of the time.

According to McKinsey, 17% of the time, IT projects go so badly that they threaten the company’s very existence.

Hewitt, Eben. Semantic Software Design

The Urge to Philosophize, Futility of

Boltzmann tried out some of his ideas on Brentano, who had the grace and sometimes the courage to take his philosophical striving seriously. But Boltzmann’s fundamental skepticism was never far from the surface. “Is there any sense at all in breaking one’s head over such things?” he asked Brentano in 1905. “Shouldn’t the irresistible urge to philosophize be compared to the vomiting caused by migraines, in that something is trying to struggle out even though there’s nothing inside?”

Boltzmann’s Atom: The Great Debate That Launched a Revolution in Physics
David Lindley

Throw One Away – 2 quotes

The management question, therefore, is not whether to build a pilot system and throw it away. You will do that. […] Hence plan to throw one away; you will, anyhow.
Fred Brooks
The Mythical Man-Month: Essays on Software Engineering
Wikiquote

Frears is carefully and patiently teasing out the power and subtlety in Shashi by getting him to act simply and underplay everything. You can see the performance developing take by take. After eight or nine takes Shashi is settled, a little tired and bored, more casual and relaxed. Now he is able to throw the scene away. And this is when he is at his best, though he himself prefers the first few takes when he considers himself to be really ‘acting’. Sometimes he can’t see why Frears wants to do so many retakes.
Hanif Kureishi
Sammy and Rosie Get Laid Screenwriter’s Diary

Seeing Your Life Onscreen – Aaron Sorkin Discusses The Social Network – Deadline Interview

DEADLINE: We know Mark Zuckerberg didn’t cooperate but did you ever meet Eduardo Saverin, the character played by Andrew Garfield?
SORKIN: Once Eduardo signed that non-disclosure agreement after his settlement, he disappeared off the face of the earth. We don’t know exactly how much he received, but it’s in the hundreds of millions. And it will probably go over a billion because he also now owns a lot of Facebook stock. But on October 1st, the movie opened and that’s the day I met Eduardo. I got a phone call from our producer Scott Rudin that a representative for Eduardo had contacted him late at night. He wanted to see the movie. So we set up a private screening for him in New York right before Lady Gaga’s private screening. It’s true. I went to meet him when the movie was over and you could have performed surgery on him without anesthesia at that point in time. I gotta say, he was a deer in the headlights, which is an understatement. He certainly did expect to like the movie a lot, but you could tell in his face that he had just relived the thing. It’s an unreasonable experience; hardly anybody, including myself, knows what it’s like to have a chapter from your life suddenly written, directed, lit, shot, and performed by actors. That was the first and only time I met Eduardo.

Deadline

Pioneering Computers, List of

Altair 8800. The pioneering microcomputer that galvanized hardware hackers. Building this kit made you learn hacking. Then you tried to figure out what to do with it.

Apple II. Steve Wozniak’s friendly, flaky, good-looking computer, wildly successful and the spark and soul of a thriving industry.

Atari 800. This home computer gave great graphics to game hackers like John Harris, though the company that made it was loath to tell you how it worked.

IBM PC. IBM’s entry into the personal computer market, which amazingly included a bit of the Hacker Ethic and took over.

IBM 704. IBM was The Enemy and this was its machine, the Hulking Giant computer in MIT’s Building 26. Later modified into the IBM 709, then the IBM 7090. Batch-processed and intolerable.

LISP Machine. The ultimate hacker computer, invented mostly by Greenblatt and subject of a bitter dispute at MIT.

PDP-1. Digital Equipment’s first minicomputer and in 1961 an interactive godsend to the MIT hackers and a slap in the face to IBM fascism.

PDP-6. Designed in part by Kotok, this mainframe computer was the cornerstone of the AI lab, with its gorgeous instruction set and sixteen sexy registers.

Sol Computer. Lee Felsenstein’s terminal-and-computer, built in two frantic months, almost the computer that turned things around. Almost wasn’t enough.

Tom Swift Terminal. Lee Felsenstein’s legendary, never-to-be-built computer terminal, which would give the user ultimate leave to get his hands on the world.

TX-0. Filled a small room, but in the late fifties, this $3 million machine was the world’s first personal computer—for the community of MIT hackers that formed around it.

Hackers: Heroes of the Computer Revolution – 25th Anniversary Edition
Steven Levy

The Algorithm Did It

Stanford Hospital staff protesting the decision by higher ups to give vaccines to admins at home from r/pics

lucynyu13
“There is an enormous demonstration going on at Stanford Hospital right now carried out by staff, who are protesting the decision by higher ups to give vaccines to some administrators and physicians who are at home and not in contact with patients INSTEAD of frontline workers.” Twitter

Only Seven of Stanford’s First 5,000 Vaccines Were Designated for Medical Residents. Stanford Medicine officials relied on a faulty algorithm to determine who should get vaccinated first, and it prioritized some high-ranking doctors over patient-facing medical residents.

tristanjones
Algorithm issue my ass.

You prioritized age, and with it seniority; you didn’t prioritize frontline work or deprioritize those working from home.

It doesn’t matter if you then asked a computer to run the numbers; you set the rules.

The sentence should be: “Hospital administration did not prioritize frontline workers but instead accounted for seniority in distributing the vaccine. As a result, only 7 of the first 5,000 vaccines for staff will go to frontline workers. These results were accepted without further scrutiny or adjustment by the administrators in charge of doing so.”

neotropic9
“The algorithm did it” is increasingly an excuse used for shitty management decisions.

GreatBallsOfFIRE
Yup. Algorithms are created by people. The correct phrasing is “the algorithm was written to do it.”
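
A hypothetical sketch of tristanjones’s point, in C++ (the fields, weights, and names are invented for illustration, not Stanford’s actual criteria): the outcome is fixed the moment a human writes the scoring rules, regardless of which computer runs the numbers.

#include <algorithm>
#include <string>
#include <vector>

struct StaffMember {
    std::string role;
    int years_of_service;  // proxy for age and seniority
    bool patient_facing;   // frontline contact with patients
    bool works_from_home;
};

// Every weight here is a human decision. Rewarding seniority while
// never consulting patient_facing or works_from_home reproduces the
// outcome described above.
int priorityScore(const StaffMember& s) {
    return s.years_of_service * 10;
}

int main() {
    std::vector<StaffMember> staff = {
        {"senior administrator", 25, false, true},
        {"first-year resident", 1, true, false},
    };
    // Sort descending by score: the at-home administrator ranks ahead
    // of the frontline resident, exactly as the rules were written.
    std::sort(staff.begin(), staff.end(),
              [](const StaffMember& a, const StaffMember& b) {
                  return priorityScore(a) > priorityScore(b);
              });
    return 0;
}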