Meanwhile, the shitty tech is winning. According to this author, bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?
"The situation is really bad for the industry." Also, why?
People with a mentality like the author's don't actually want to build things. They just want to sit and complain. Their sense of righteousness and victim mentality gives them more pleasure and validation than actually engaging with the "modern" tech world.
Some other articles by the same author:
- "Using a framework can make you stupid!"
- "So-called modern web developers are the culprits"
- "One sure way to determine if you are stupid"
- "SQLite the only database you will ever need in most cases"
- "No, your website is not a web app even if you call it so"
Why? The (ultimately unsuccessful) quest for the silver bullet. Nobody wants programming, they want programs - so anything that promises to deliver programs faster looks like a holy grail. Inevitably, though, the promise boils down to a pre-packaged implementation of an existing approach that does something relatively specific, with some options for customization. If you want to step outside that customization, you not only have to work around the assumptions baked into the new "silver bullet", you also have to understand all the nuances of the layers upon layers of other approaches (and all of their assumptions), to the point where it would be faster to just shed all the layers and do it yourself (but you can't, because noooo, you're a dinosaur, you don't understand anything, it's the future, it's the modern way of doing things).
Seriously though, there is an equilibrium somewhere, and on top of that a lot of stuff that is more about business and work than programming.
I may be getting out of my depth here but, am I wrong to say that a good deal of why webdev in Java is even a thing is that there were people around who knew Java to begin with? That a good deal of people learned PHP just because WordPress is a thing? That React Native for desktop is about people with React Native expertise being able to develop for desktop?
Workers want paychecks. Society wants products. Workers will learn to make products so they earn paychecks and if they can spend less time learning they will, obviously, do so.
I notice that this mentality isn’t as common in circles of people with their nose to the grindstone building things.
Right now I have some game modding Discord servers in mind where everyone has crappy tools but is focused on building things regardless, and complaining about the state of things just feels trite, as if the entire ecosystem is responding “Ok, in Utopia things are better, but what are you building today?”
For some reason people confuse sour grapes with some sort of highbrow eureka they need to dump onto the world.
I agree with you. I agree with the author. I put my own ideas into practice.
I'm not a fan of it. I understand the author's complaints and frustrations (I am also "chronologically-challenged"), but I have found that complaining doesn't really buy me much. I'll do a minor whine, from time to time (usually about the ageism in tech), but, for the most part, I try to bring some light to the conversation.
I write native (Swift) apps for Apple systems. I tend to avoid the "bleeding edge," as that isn't really where Ship lives, but I think that the way I do things results in very good software (note that I didn't say "better software," as comparing against others only pisses people off and doesn't buy me anything; unlike a lot of folks, I'm not in competition with anyone).
Of course, that doesn't prevent others from sneering at me, but that's their burden. I'm not exactly sure what benefit it gives them, as I am no threat.
If people like the way I do things, and want to improve that, then I am often fairly helpful, but I am not really into standing in the street, yelling at passing cars. I have better things to do.
Swift is only 7 years old, making it one of the newest languages/tools in existence. That's an implicit argument against the author's case.
One might perhaps argue that Swift's newness/modernness is an exception, that it's "one of the good ones" when it comes to new systems. But this doesn't really work. Using Swift for native GUI app building is platform-specific, and writing native apps for other platforms requires other tools. If Swift represents an improvement on what came before, it implies that such improvements are needed on other platforms also. (If it doesn't represent an improvement, why are you using it?) This also implies that cross-platform solutions might be useful - like Electron or React Native!
Being chronologically challenged doesn't actually prevent one from understanding the driving forces behind modern software technology. Certainly, fads and cargo cults exist, driven in part by people's inevitably incomplete understanding and desire to follow practices that others seem to be using successfully. But to be able to distinguish between the fads and the useful advancements requires a better understanding than the OP exhibits.
Complaining seems like a perfectly fine activity to me if there's some valid content in it. But it becomes fairly useless otherwise, except possibly as you say for driving social media engagement.
New is not necessarily bad. Old is not necessarily good. It all depends on what we are doing, and what our goals are.
I also use PHP to write my backends. It works nicely, for the scope of my needs. I’ve been writing in that, for over twenty years. My servers work quite well, but aren’t particularly “buzzword-compliant.”
I’m not really a server programmer, though. If I need a backend to be more intense than what I’m capable of doing, then I’ll hire someone to write it, using a more robust stack, and I need to be prepared to open my wallet, as I have high standards. Good engineers will always cost more. They’re worth it. They will frequently also have gray hair; regardless of the tech.
The funny thing is, 20 years ago the same arguments were being made.
"Windows MFC is a nightmare to write against!"
"X11 is terrible, why are we using it?"
"Everything we write is built on so many abstractions developers don't know what they are doing."
"VB6 is dumbing down development"
"Developers writing in Java don't understand what is really going on."
"Java is too slow!"
OK the last one was true. UIs in Java were obscenely memory intensive relative to computers back when Java was first introduced.
But yeah, the more things change, the more the complaints stay the same.
Then Java developers heard this, confused it with "Java executes slowly", and so they never fixed it, until Java applets just became obsolete.
"Amazing new IDE has just come out! Good luck running it!"
Also, local Swing apps were memory hogs for the time. 20MB was a lot on a machine that had 256 or 512MB in total.
Which ones do you dedicate your precious time to as a direction to keep earning a paycheck?
I'm pretty convinced my future grandchildren will be maintaining legacy LAMP systems.
Most of these solutions that the author complains about came about to solve real problems, and without addressing how we can continue to solve those real problems with a simpler set of solutions, it's really just noise.
The bad side is that passionate young people come in hordes and will gladly program anything for a penny. For a company it's cheaper to hire a team of those youngsters and let them make your program in any (ugly) way they can, rather than pay a fortune for a veteran programmer who takes their time to deal with code that manually handles memory.
The good side is that you no longer need a computer degree or 20+ years of programming experience to make something decent.
Because in the time that OP spends tracking down segfaults and getting a dozen native libraries to compile for his native app, the JS dev has already finished XYZ feature. And because the environment is so high level, he can use the extra brain capacity for things like UX, business logic, and accessibility (things people actually care about) rather than meaningless implementation details.
This stuff won out for a reason. HTML/CSS/JS is literally a 10x improvement in iteration speed versus native.
QML gives you JS for the view layer (and some logic if you desire), and a declarative way to lay out your UI. I think it's worth a comparison as a reasonable way to quickly iterate on a UI while having something very lean and portable as the output product.
There seems to be a dearth of actually open, native, and cross-platform GUI toolkits.
When you need to release your application across multiple platforms? Yes. Not much else even begins to compare on that front.
Compared to native development on a single platform? Not even close. HTML/CSS/JS development is painfully slow in comparison.
I find it's often not, largely because the amount of focus on web apps means the native frameworks often are less productive by comparison than what is available for the web.
How? Do you send pull requests to thousands of projects to delete their codebase?
He is complaining about unnecessary churn and unnecessary software and you obviously don't reduce it by writing even more stuff.
Nothing kills software as effectively as creating alternative software that works way better.
Would it have anything to do with coding boot camps teaching people to code against these specific frameworks, versus deeper learning in broader coding?
"[Electron apps] constantly crash and have no value over a native desktop application whatsoever"
All those examples could or do work just as well as web apps, right? I don't think that's the author's message, but I'd rather have those types of apps be web-based instead of Electron-based or native.
Say, for example, I used a batch of 100 HN accounts to mass-downvote your comment. After a while an hourly job runs that looks at all those votes in aggregate, determines they were coordinated (for simplicity's sake, they all came from the same IP), and removes them. You'd see a large shift in the net score of your comment all of a sudden. This isn't HN manipulating anything, it's them doing their job to prevent abuse on the platform.
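As a rough illustration, here is a minimal sketch of the kind of hourly cleanup job described above. The Vote record, the IP-based grouping, and the RING_THRESHOLD value are all assumptions made up for the example; this is not HN's actual anti-abuse code.

```python
from collections import defaultdict
from dataclasses import dataclass

# Hypothetical vote record; the real schema and thresholds are unknown.
@dataclass
class Vote:
    account: str
    comment_id: int
    ip: str
    direction: int  # -1 for downvote, +1 for upvote

RING_THRESHOLD = 20  # assumed cutoff: many distinct accounts voting the same way from one IP

def remove_coordinated_votes(votes: list[Vote]) -> list[Vote]:
    """Drop votes that look coordinated: the same IP hitting the same
    comment in the same direction from many different accounts."""
    groups: dict[tuple[str, int, int], list[Vote]] = defaultdict(list)
    for v in votes:
        groups[(v.ip, v.comment_id, v.direction)].append(v)

    kept: list[Vote] = []
    for group in groups.values():
        distinct_accounts = {v.account for v in group}
        if len(distinct_accounts) >= RING_THRESHOLD:
            continue  # treat the whole group as a vote ring and discard it
        kept.extend(group)
    return kept
```

Once a group like that is discarded and the comment's net score is recomputed from the remaining votes, the score jumps back, which from the outside looks exactly like the sudden shift described.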
I think that unintentionally removes downvotes on obviously bad videos that people do not waste much time watching.
For instance, an hour-long video claiming to be what you're looking for, but actually consisting of a still image and a URL to a scam/spam site.
I've seen dislike attacks happen. And if it takes a non-trivial procedure to make them count, people will document that procedure. For instance, hypothetically:
"Alright everybody, here's the link. Remember, mute the tab right away but don't mute the video itself; wait 2 minutes (the timer that starts when you click the link will go green to let you know), then click the dislike button."
I've seen much more complex instructions offered as part of gaming a poll, as well as sites built to help semi-automate or simplify the process.
The key distinction that would be useful for YouTube to measure: did you encounter the video and then dislike it, or did you visit a video you were referred to for the sole purpose of disliking it? I don't think hiding dislike counts serves that purpose, though.
"This world needs a lot more new plumbers than new liberal arts majors"
This is a completely subjective and politically charged statement. I think the world needs more informed and educated critical thinkers. It's curious how the '90s led to a collapse in liberal arts departments and enrollments, and 20 years later we're now in an era of disinformation at massive scale.
Tony Blair's university pledge has failed, says his son
"Tony Blair's landmark target for half of school leavers to attend university is no longer fit for purpose, his son has said – claiming that the former prime minister agrees with him [..] critics have argued that it encouraged an unhealthy focus on higher education which led to a proliferation of pointless "Mickey Mouse" degrees and lured undergraduates into racking up large debts without a meaningful increase in their salary prospects."
In my opinion, if we listened to more sociologists instead of the Zuckerbergs and Bezoses of the world, we'd be much better off as a country.
Also, there's not a lot of actual "science" going on in Computer Science departments (meaning following the scientific method in the course of conducting research, as opposed to "here is a thing I built, isn't it neat?"), but that doesn't stop tech-inclined people from joining in droves. So I'm not sure what the "no science going on" criticism means. It's not as if the scientific process is used when deploying technology solutions to problems. We just say "Here's a thing! Let's distribute it to as many people as possible!" without any forethought into what the consequences will be.
To that last point, from my recollection of the late aughts and early '10s, it was sociologists who first identified that maybe Facebook and Instagram would be bad for society. Turns out they were right. Again.
People are rooting for Sweeney due to his alpha-nerd righteous rage more than because of how "good" Epic is to its customers.
If it's not, then I don't see the contradiction here. Leaving aside people who make issues like this part of their identity, you can root for Epic in this specific case without deciding that you love Epic, with the focus on the impact of _this case_ on the industry. Just as you can root for Google in Google v Oracle without becoming a superfan of Google's.
If I'm painting a purple sphere, then the cast shadow will be a much darker value of purple.
People really seem to be hung up on equating a sitting president attempting to overturn an election with people on Twitter saying mean things.
The FBI concluded in Aug. 2021 that there was no insurrection attempt.
Note that Pelosi is still trying to suppress why she didn't provide adequate Capitol security as requested for 100,000 attendees.
Location: New York City
Remote: Yes, but prefer companies with a local presence
Willing to relocate: No
Technologies: Python, Airflow, Flink, Java, SQL, K8s, Spark, Hadoop, Kafka
Résumé/CV: Work in progress! Just beginning my search
Email: In my profile
The "email" field in your profile isn't public. You need to put it in the "about" field if you want it to be public. That's why users have been replying to you in the thread here.
This has been a common misunderstanding in the past so we actually added some clarifying text to the profile page a few weeks ago! https://news.ycombinator.com/item?id=28016257
Feel free to send me an email at saunghee at deliveryrelay.com to learn more!
The role is here: https://courted.breezy.hr/p/8fe5f20cfb72-senior-data-enginee...
Right now it really feels like this isn't a priority. Which is fine.
But if it is a priority that other people replicate your work, I'm not sure you're making this as easy for people as possible.
If you could leverage AI to generate digital art based on real artist/photographer inputs, perhaps you could create a nice little marketplace business... or maybe just a simple AI generator plugin for an existing marketplace...
Having said that, I'm curious and somewhat excited to see how these will evolve. As I said, I find them beautiful. As a painter myself there is nothing out there that will make me not paint. Sure, I sometimes use tools but there's always the me in there who is in control or driven by my human instinct.
Generative art can be a wildly creative process on par with anything a painter works through. It's just a different medium.
The way you express yourself through art is not threatened by people choosing other ways using different tools. Painting did not become obsolete because someone invented art photography.
I do agree with the sentiment about NFT 'artists' though. Copy pasting a colab notebook, replacing a string and selling the result as NFT is just idiotic.
I wonder who the bigger fool is. The one who sells or the one who buys.
Wired article about the demo scene from 1995
I could see AI "art" competing on price with stock pictures and cheap illustrations for throwaway/placeholder use cases, or when uniqueness is preferred.
It's already happening with social media profile pictures, for instance. Next up could be the filler artworks in hotel rooms.