prions's comments

It will end when people like the author drop their victim mentality and start putting their own ideas into practice. It seems that this person is content to yell from the sidelines about how good and efficient things used to be.

Meanwhile, the shitty tech is winning. According to this author, bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?

"The situation is really bad for the industry." Also, why?

People with a mentality like the author's don't actually want to build things. They just want to sit and complain. Their sense of righteousness and victimhood gives them more pleasure and validation than actually engaging with the "modern" tech world.

Some other articles by the same author:

- "Using a framework can make you stupid!"

- "So-called modern web developers are the culprits"

- "One sure way to determine if you are stupid"

- "SQLite the only database you will ever need in most cases"

- "No, your website is not a web app even if you call it so"


> bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?

Why? The (ultimately unsuccessful) quest for the silver bullet. Nobody wants programming, they want programs - so anything that promises to deliver programs faster looks like a holy grail. Inevitably, though, the promise boils down to a pre-packaged implementation of an existing approach that does something relatively specific, with some options for customization. If you want to step outside that customization, you not only have to contend with the assumptions baked into the new "silver bullet", you also have to understand all the nuances of the layers upon layers of other approaches (and all of their assumptions), to the point where it would be faster to just shed all the layers and do it yourself (but you can't because noooo, you're a dinosaur, you don't understand anything, it's the future, it's the modern way of doing things).


Which is why we ought to be building the web in assembly.

Seriously though, there is an equilibrium somewhere, and on top of that a lot of stuff that is more about business and work than programming.

I may be getting out of my depth here, but am I wrong to say that a good deal of the reason webdev in Java is even a thing is that there were people around who knew Java to begin with? That a good number of people learned PHP just because WordPress is a thing? That React Native for desktop is about people with React Native expertise being able to develop for desktop?

Workers want paychecks. Society wants products. Workers will learn to make products so they earn paychecks and if they can spend less time learning they will, obviously, do so.


It’s refreshing to see this as the top comment on a forum where we are often guilty of this mentality.

I notice that this mentality isn’t as common in circles of people with their nose to the grindstone building things.

Right now I have some game-modding Discord servers in mind where everyone has crappy tools but is focused on building things regardless. Complaining about the state of things there just feels trite, as if the entire ecosystem is responding, "OK, in Utopia things are better, but what are you building today?"

For some reason people confuse sour grapes with some sort of highbrow eureka they need to dump onto the world.


> The browser's native language is HTML. HTML is a markup language, and if you feed HTML to the browser it will very quickly render the page with what you give it. But no. Let's not do that. Let's instead feed the browser with JSON and then have the built-in JavaScript engine translate that into a pre-formatted output which then gets translated into HTML before it's served to the user.

I agree with you. I agree with the author. I put my own idea into practice:

https://htmx.org
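
To make the contrast concrete, here's a minimal sketch of the two styles of endpoint (Flask is purely an assumed stand-in for any server here; htmx itself is server-agnostic and just wants HTML fragments back):

  from flask import Flask

  app = Flask(__name__)

  TODOS = ["write less JavaScript", "ship the page"]

  # JSON endpoint: the browser's JS must fetch this and template it
  # into HTML itself before anything renders.
  @app.get("/todos.json")
  def todos_json():
      return {"items": TODOS}

  # HTML endpoint: an element like <ul hx-get="/todos.html"
  # hx-trigger="load"></ul> swaps this fragment straight into the
  # page, with no client-side templating step at all.
  @app.get("/todos.html")
  def todos_html():
      items = "".join(f"<li>{t}</li>" for t in TODOS)
      return f"<ul>{items}</ul>"

Same data, same server; the only difference is which side of the wire turns it into markup.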


Internet complaining gets eyeballs. Look at this comment thread. Look at the position in the HN queue.

I'm not a fan of it. I understand the author's complaints and frustrations (I am also "chronologically-challenged"), but I have found that complaining doesn't really buy me much. I'll do a minor whine, from time to time (usually about the ageism in tech), but, for the most part, I try to bring some light to the conversation.

I write native (Swift) apps for Apple systems. I tend to avoid the "bleeding edge," as that isn't really where Ship lives, but I think that the way I do things results in very good software (note that I didn't say "better software," as comparing against others only pisses people off and doesn't buy me anything; unlike a lot of folks, I'm not in competition with anyone).

Of course, that doesn't prevent others from sneering at me, but that's their burden. I'm not exactly sure what benefit it gives them, as I am no threat.

If people like the way I do things, and want to improve that, then I am often fairly helpful, but I am not really into standing in the street, yelling at passing cars. I have better things to do.


> I write native (Swift) apps for Apple systems.

Swift is only 7 years old, one of the newest languages/tools in existence. This is an implicit argument against the author's case.

One might perhaps argue that Swift's newness/modernness is an exception, that it's "one of the good ones" when it comes to new systems. But this doesn't really work. Using Swift for native GUI app building is platform-specific, and writing native apps for other platforms requires other tools. If Swift represents an improvement on what came before, it implies that such improvements are needed on other platforms also. (If it doesn't represent an improvement, why are you using it?) This also implies that cross-platform solutions might be useful - like Electron or React Native!

Being chronologically challenged doesn't actually prevent one from understanding the driving forces behind modern software technology. Certainly, fads and cargo cults exist, driven in part by people's inevitably incomplete understanding and desire to follow practices that others seem to be using successfully. But to be able to distinguish between the fads and the useful advancements requires a better understanding than the OP exhibits.

Complaining seems like a perfectly fine activity to me if there's some valid content in it. But it becomes fairly useless otherwise, except possibly as you say for driving social media engagement.


Exactly.

New is not necessarily bad. Old is not necessarily good. It all depends on what we are doing, and what our goals are.

I also use PHP to write my backends. It works nicely for the scope of my needs. I've been writing in it for over twenty years. My servers work quite well, but aren't particularly "buzzword-compliant."

I’m not really a server programmer, though. If I need a backend to be more intense than what I’m capable of doing, then I’ll hire someone to write it, using a more robust stack, and I need to be prepared to open my wallet, as I have high standards. Good engineers will always cost more. They’re worth it. They will frequently also have gray hair; regardless of the tech.


> Meanwhile, the shitty tech is winning. According to this author, bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?

The funny thing is, 20 years ago the same arguments were being made.

"Windows MFC is a nightmare to write against!"

"X11 is terrible, why are we using it?"

"Everything we write is built on so many abstractions developers don't know what they are doing."

"VB6 is dumbing down development"

"Developers writing in Java don't understand what is really going on."

"Java is too slow!"

OK, the last one was true. UIs in Java were obscenely memory-intensive relative to the computers of the day when Java was first introduced.

But yeah, the more things change, the more the complaints stay the same.


All of them are still true, except for the last one (for nearly all use cases).

Java got its slow reputation mainly because any early encounter with a Java applet would proudly display the Java logo and then cause your computer and disk to thrash for 30 seconds.

Then Java developers heard this, confused it with "Java executes slowly", and so they never fixed it, until Java applets just became obsolete.


Eclipse didn't help Java's reputation any.

"Amazing new IDE has just come out! Good luck running it!"

Also, local Swing apps were memory hogs for the time. 20MB was a lot on a machine that had 256 or 512MB in total.


Alternatively, analysis paralysis makes it impossible to decide which stack to learn and build from. Which one is safe? Which one is secure? Which one has legs?

Which ones do you dedicate your precious time to as a direction to keep earning a paycheck?


>Which ones do you dedicate your precious time to as a direction to keep earning a paycheck?

LAMP.

I'm pretty convinced my future grandchildren will be maintaining legacy LAMP systems.


There are also great projects gaining momentum, like Hotwire, that seek to actually address some of these problems of never-ending layers of complexity. I think constructive posts advocating for things like that would be more productive than the millionth "SPAs are too complicated" blog post.

Most of these solutions that the author complains about came about to solve real problems, and without addressing how we can continue to solve those real problems with a simpler set of solutions, it's really just noise.


Veterans tend to gloss over the fact that most "evolution" in programming revolves around programmer experience, not around resource usage.

The bad side is that passionate young people come in hordes and will gladly program anything for a penny. It's cheaper for a company to hire a team of those youngsters and let them build the program in whatever (ugly) way they can than to pay a fortune for a veteran programmer who takes their time dealing with code that manually handles memory.

The good side is that you no longer need a computer science degree or 20+ years of programming experience to make something decent.


>Meanwhile, the shitty tech is winning. According to this author, bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?

Because in the time that OP spends tracking down segfaults and getting a dozen native libraries to compile for his native app, the JS dev has already finished XYZ feature. And because the environment is so high level, he can use the extra brain capacity for things like UX, business logic, and accessibility (things people actually care about) rather than meaningless implementation details.

This stuff won out for a reason. HTML/CSS/JS is literally a 10x improvement in iteration speed versus native.


I've been out of the GUI scene for a long time, but I will say I've been hella-impressed with some of the more modern Qt tech.

QML gives you JS for the view layer (and some logic if you desire), and a declarative way to lay out your UI. I think it's worth a comparison as a reasonable way to quickly iterate on a UI while having something very lean and portable as the output product.


Wait, isn't Qt proprietary? I feel like that kind of vendor lock-in alone disqualifies it for a lot of use cases...

There seems to be a dearth of actually open, native, and cross-platform GUI toolkits.


Qt is a mix of LGPL for the core, and GPL for some of the "value-add" modules (for which there's a commercial license available for those who can't live with GPL requirements).

> HTML/CSS/JS is literally a 10x improvement in iteration speed versus native.

When you need to release your application across multiple platforms? Yes. Not much else even begins to compare on that front.

Compared to native development on a single platform? Not even close. HTML/CSS/JS development is painfully slow in comparison.


> Compared to native development on a single platform? Not even close. HTML/CSS/JS development is painfully slow in comparison.

I find it's often not, largely because all the focus on web apps means the native frameworks are often less productive than what is available for the web.


I'm still waiting for anything in HTML/JS land that's as productive as WPF.

I don't see why I'd ever spend time on any particular platform when I could avoid it. I don't like any of them, but I still want my software to work on them.

It didn't win for that reason.

> drop their victim mentality and start putting their own ideas into practice

How? Do you send pull requests to thousands of projects to delete their codebases?


Invest in your own super-wonderful alternative and they'll crumble at the mere sight of your engineering superiority.

Or not.


Missing the point a bit? The author is in no way complaining about the lack of choice in web frameworks and similarly hyped up stuff.

He is complaining about unnecessary churn and unnecessary software, and you obviously don't reduce those by writing even more stuff.


You can reduce the abundance of software by writing more software.

Nothing kills software as effectively as creating alternative software that works way better.


Citation needed.

>Meanwhile, the shitty tech is winning. According to this author, bad code and bad tools and bad frameworks are reigning supreme over real engineering. Why?

Would it have anything to do with coding boot camps teaching people to code against these specific frameworks rather than giving them a deeper, broader grounding in programming?


Dead on. Building efficient and elegant software is difficult. The author should actually give it a try some time.

Yep. When I saw the tired "Electron = bad" rant I immediately went to look for a tab entitled "Projects" to see if the author had created a native desktop app with "value"[1] equal to or greater than what a good Electron app provides (Slack, Spotify, Discord, VS Code). But lo and behold, no such tab.

[1]: [Electron apps] constantly crash and have no value over a native desktop application whatsoever


> Slack, Spotify, Discord

All those examples could or do work just as well as web apps, right? I don't think that's the author's message, but I'd rather have those types of apps be web-based instead of Electron-based or native.


Electron apps are pretty much that, though, just with extra access to the system that you wouldn't want to give to every web page.

They are web apps... open.spotify.com, discord.com, app.slack.com. Even vscode.dev. The desktop product is just a repackaging of the web app with perhaps more OS integrations. If you don't want the OS integrations, use the web app. Some are available as a PWA too, which provides a nice middle ground.

They all have artificial restrictions in the web versions that make them worse, to encourage people to install the apps (actually, I'm not sure if Discord does this; I've only used it a little bit). For Spotify and Slack there is definitely an active push to make their web apps worse than they need to be.

At least for VSCode, it's the other way around - the app was first, and even if it was always an Electron app, it was still designed and written for the desktop. It actually took some time and effort to port that to vscode.dev, which wasn't there at the beginning.

VS Code started as the Monaco editor, which was (and still is) first and foremost a code editor for the web.

Just taking a look at his YouTube channel: why is he so hot on culture-war issues now? It seems that right to repair has taken a back seat to his polemics about Covid lockdowns, vaccine skepticism, and homeless people.


He is a business owner in New York City so Covid lockdowns, vaccine mandates, local government bureaucracy, and the real estate market are affecting him personally. He seems to view his channel as a platform for his varied interests that often overlap instead of an algorithm-maximizing focus on purely Macbook board repair.


I wish YouTube had a "Culture War" button. It would be wonderful to have an easy way to steer clear of culture-war bullshit. Not just on YouTube, but across the whole internet.


So you'd rather they manipulate the data and potentially mislead users? Sounds like a slippery slope to me /s


YouTube has been manipulating like/dislike data for the White House for months now [1].

[1]: https://81m.org/

Archive: https://web.archive.org/web/20211023061808/https://81m.org/


It looks like the creator is counting any decrease in dislikes over time as manipulation, without a firm understanding of how large-scale systems work.

Say, for example, I used a batch of 100 HN accounts to mass-downvote your comment. After a while, an hourly job runs that looks at all those votes in aggregate, determines they were coordinated (for simplicity's sake, they all came from the same IP), and removes them. You'd see a large shift in the net score of your comment all of a sudden. This isn't HN manipulating anything; it's them doing their job to prevent abuse on the platform.
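
To make that concrete, a toy version of such a batch job might look like this (the single-IP heuristic and the field names are simplifying assumptions for illustration, not how HN actually works):

  from collections import defaultdict

  def remove_coordinated_votes(votes, threshold=20):
      """votes: list of dicts like {"item_id": 1, "ip": "1.2.3.4", "delta": -1}."""
      by_ip = defaultdict(list)
      for vote in votes:
          by_ip[vote["ip"]].append(vote)
      kept = []
      for ip, group in by_ip.items():
          # A burst of votes from one IP, all aimed at the same item,
          # is treated as coordinated and rolled back wholesale.
          same_target = len({v["item_id"] for v in group}) == 1
          if len(group) >= threshold and same_target:
              continue
          kept.extend(group)
      return kept

Recomputing an item's score from only the kept votes is what would produce the sudden jump in net score described above.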


It's interesting data, but the obvious response from YouTube would be "we run sophisticated click fraud detection algorithms and periodically remove interactions determined to be from fraudulent accounts; given the current political climate White House videos attract more of these types of fraud than other videos on our platform and thus the effect is more pronounced on their videos". The numbers in question are small enough (~1k missing interactions) that it doesn't seem totally unreasonable.


No, I'd rather have a known, established system/metric in place that amortizes brigading so as to diffuse the effects of mob attacks on internet content.


I don't even buy YouTube's claim that "brigading" or "dislike attacks" is a real thing or a problem. If they have a minimum number of minutes that a video needs to be watched before a vote is counted, then that vote is legitimate, full stop. YouTube are simply unhappy about which videos users dislike, whether it's the White House's videos, important brands, or YT's own videos.


> If they have a minimum number of minutes that a video needs to be watched before a vote is counted

I think that unintentionally removes downvotes on obviously bad videos that people do not waste much time watching.


Agreed. Sometimes you know within a couple of seconds that a video isn't what the title and thumbnail claims it is, and that's a great reason to hit "dislike".

For instance, an hour-long video claiming to be what you're looking for, but actually consisting of a still image and a URL to a scam/spam site.


> I don't even buy YouTube's claim that "brigading" or "dislike attacks" is a real thing or a problem. If they have a minimum number of minutes that a video needs to be watched before a vote is counted, then that vote is legitimate, full stop.

I've seen dislike attacks happen. And if it takes a non-trivial procedure to make them count, people will document that procedure. For instance, hypothetically:

"Alright everybody, here's the link. Remember, mute the tab right away but don't mute the video itself; wait 2 minutes (the timer that starts when you click the link will go green to let you know), then click the dislike button."

I've seen much more complex instructions offered as part of gaming a poll, as well as sites built to help semi-automate or simplify the process.

The key distinction that would be useful for YouTube to measure: did you encounter the video and then dislike it, or did you visit a video you were referred to for the sole purpose of disliking it? I don't think hiding dislike counts serves that purpose, though.


I like that solution even more.


Isn't that what Reddit is doing, with quite some success?


Look at some objective measurements segmented by race, such as earnings or life expectancy, and revisit whether you think race has "suddenly" mattered


You're underestimating the value of a college degree. Research like [1] shows that degree holders have significantly higher lifetime earnings than high-school graduates. You also miss how the democratization of education has benefited our society and helped lift millions out of subsistence lifestyles and class lock-in.

"This world needs a lot more new plumbers than new liberal arts majors"

This is a completely subjective and politically charged statement. I think the world needs more informed and educated critical thinkers. It's curious how the 90s led to a collapse in liberal arts departments and enrollments, and 20 years later we're now in an era of disinformation at massive scale.

1. https://www.ssa.gov/policy/docs/research-summaries/education...


> You're underestimating the value of a college degree

Tony Blair's university pledge has failed, says his son

"Tony Blair's landmark target for half of school leavers to attend university is no longer fit for purpose, his son has said – claiming that the former prime minister agrees with him [..] critics have argued that it encouraged an unhealthy focus on higher education which led to a proliferation of pointless "Mickey Mouse" degrees and lured undergraduates into racking up large debts without a meaningful increase in their salary prospects."


If this is the problem in the UK where education is cheap, it is even more of a problem in the USA.


But this doesn't adjust for intelligence. If employers are just using the degree as a proxy for a base level of intelligence (and they really are, also class) then you would see tremendous value by this measure. But really that would be 100% waste, because you could do the same thing in an hour with an IQ test that is currently being done with 4 years of college. The tech industry basically does this now with algorithm problems. Google DGAF if you have a degree if you can memorize a bunch of algorithms.


Many in the tech community turn exclusively to technology to solve every problem. It doesn’t occur to such people that e.g. sociologists, who study problems of society as a discipline, have any answers whatsoever to our problems as a society. To them, sociology is just another useless liberal arts degree without job prospects.

In my opinion, if we listened to more sociologists instead of the Zuckerbergs and Bezoses of the world, we'd be much better off as a country.


But I did listen to sociologists when I took a distribution requirement in college. I'll never do it again. It was 80% totally obvious facts about human behavior that don't need to be taught and 20% totally contrived BS. No science going on there.


So you took one intro class in a subject and determined the entire field is worthless? Do you apply this decision-making process to other aspects of your life?

Also, not a lot of actual "science" goes on in Computer Science departments (meaning following the scientific method in the course of conducting research, as opposed to "here is a thing I built, isn't it neat?"), but that doesn't stop tech-inclined people from joining in droves. So I'm not sure what the "no science going on" criticism means. It's not like the scientific process is used when deploying technology solutions to problems. We just say "Here's a thing! Let's distribute this to as many people as possible!" without any forethought about what the consequences will be.

To that last point, from my recollection of the late aughts, early 10s, it was sociologists who first identified that maybe Facebook and Instagram would be bad for society. Turns out they were right. Again.


Because life under Epic's vision of undercutting and delaying releases to Steam is so much more egalitarian and noble.

People are rooting for Sweeney because of his righteous alpha-nerd rage more than because of how "good" Epic is to its customers.


I don't fully understand the dynamics here, but are those complaints independent of the issue being fought out between Apple and Epic? I.e., couldn't Epic still be engaging in "undercutting and delaying releases" whether or not they're paying 30% to Apple, or is that behavior somehow tangled up in this issue?

If it's not, then I don't see the contradiction here. Leaving aside people who make issues like this part of their identity, you can root for Epic in this specific case without deciding that you love Epic, with the focus on the impact of _this case_ on the industry. Just as you can root for Google in Google v Oracle without becoming a superfan of Google's.


(amateur painter) Generally, shadows have a lower chroma and value than the color of the object.

If I'm painting a purple sphere, then the cast shadow will be a much darker value of purple.
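
For the programmers in the thread, a rough numeric version of that rule of thumb, treating HSV saturation as a stand-in for chroma (the scaling factors are pure taste, not color theory):

  import colorsys

  def shadow_color(r, g, b, value_scale=0.45, sat_scale=0.8):
      """r, g, b in 0..1; returns a darker, slightly duller color."""
      h, s, v = colorsys.rgb_to_hsv(r, g, b)
      # Lower the value a lot and the saturation a little, per the
      # rule of thumb above.
      return colorsys.hsv_to_rgb(h, s * sat_scale, v * value_scale)

  # A purple sphere gets a much darker purple cast shadow:
  print(shadow_color(0.55, 0.2, 0.7))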


It's really easy to take a both-sides stance here, but the last presidency (ending in that same president causing an insurrection attempt at our Capitol) shows that one side is clearly more dedicated to authoritarianism than the other - socially, politically, and legally.

People really seem to be hung up on equating a sitting president attempting to overturn an election with people on Twitter saying mean things.


TBF, the Obama administration set some nasty precedents internally (whistleblower prosecutions under the Espionage Act) and externally (assassinations), plus plenty of similar precedents were set by previous administrations, both D and R.


I don't consider my framing to be a "both-sides" false equivalency; I simply omitted labels, because I felt that was rhetorically stronger. The framing is entirely compatible with the possibility that one side exhibits much more severe authoritarian tendencies than the other.


> president causing an insurrection attempt at our capital

The FBI concluded in Aug. 2021 that there was no insurrection attempt.

https://www.reuters.com/world/us/exclusive-fbi-finds-scant-e...

Note that Pelosi is still trying to suppress why she didn't provide adequate Capitol security as requested for 100,000 attendees.


Senior Data Engineer. Interested in DE/ML Eng jobs but open to exploring other work.

  Location: New York City
  Remote: Yes, but prefer companies with a local presence
  Willing to relocate: No
  Technologies: Python, Airflow, Flink, Java, SQL, K8s, Spark, Hadoop, Kafka
  Résumé/CV: Work in progress! Just beginning my search
  Email: In my profile


> Email: In my profile

The "email" field in your profile isn't public. You need to put it in the "about" field if you want it to be public. That's why users have been replying to you in the thread here.

This has been a common misunderstanding in the past so we actually added some clarifying text to the profile page a few weeks ago! https://news.ycombinator.com/item?id=28016257


Thanks!


Hi there! My company is hiring a Senior Data Engineer, and I think you'd be a fantastic fit. Here is the link to the posting: https://jobs.lever.co/deliveryrelay/3fc7cde8-4dea-4151-bc67-...

Feel free to send me an email at saunghee at deliveryrelay.com to learn more!


Hello! Your profile aligns really well with a Senior Data Engineer position I have open. Can you share your contact information if you're interested?

The role is here: https://courted.breezy.hr/p/8fe5f20cfb72-senior-data-enginee...


This is really cool! I'd love to seed the GAN with my own artwork and generate new pieces in my style


Thanks! Depending on how many artworks you've created, it might be difficult to train a GAN on them (due to overfitting). What you might try is to train one network on a lot of random artworks, then use a style-transfer network to convert the generated pieces into your style.
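
If you want to experiment, that second step could look roughly like this using TensorFlow Hub's off-the-shelf arbitrary-stylization model (the model URL is real; the file names are placeholder assumptions):

  import tensorflow as tf
  import tensorflow_hub as hub

  def load_image(path, size=512):
      # Read an image file into a float32 [1, size, size, 3] batch in 0..1.
      img = tf.image.decode_image(tf.io.read_file(path), channels=3)
      img = tf.image.convert_image_dtype(img, tf.float32)
      img = tf.image.resize(img, (size, size))
      return img[tf.newaxis, ...]

  model = hub.load(
      "https://tfhub.dev/google/magenta/arbitrary-image-stylization-v1-256/2")
  content = load_image("generated_piece.png")  # output of the GAN
  style = load_image("my_own_artwork.png")     # one of your pieces
  # The model returns a tuple; the first element is the stylized batch.
  stylized = model(tf.constant(content), tf.constant(style))[0]
  tf.keras.utils.save_img("stylized.png", stylized[0])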


Couldn't you use something like style transfer to take your own artwork's style and apply it to the generated art?


Yeah exactly, that's what I meant!


Do you have the code for training? I wasn't able to find it in your repo. That would be so cool!


I left it out as I wanted people to use their own generative art. Here's the implementation I used: https://github.com/taki0112/StyleGAN-Tensorflow


Hey, thanks! So we just add our pictures and run the Python script? And we can take the resulting model and drop it into your code as a replacement?


Sorry for the late reply. Yeah, I used that repo to train the NN. Then I modified and reduced the code to only include the generative part, and that is what I added to the kiosk code.


Do you actually want to disseminate your work and have many other people try it? Or is that not actually a goal?

Right now it really feels like this isn't a priority. Which is fine.

But if it is a priority that other people replicate your work, I'm not sure you're making this as easy for people as possible.


My intention is not for people to replicate my work (the trained GAN), but rather to supply a tutorial on how to build the installation. Then people can add their own generative art/code. It could be ML-generated, or traditional "code art".


This comment seeded an interesting idea! Many artists and photographers want to get into the NFT space, but they don't necessarily have experience in digital art creation.

If you could leverage AI to generate digital art based on real artist/photographer inputs, perhaps you could create a nice little marketplace business... or maybe just a simple AI-generator plugin for an existing marketplace...


Most artists (with some exceptions) want to have nothing to do with AI generative art. They will simply continue to produce art the way they do with older technologies such as paints and brushes, musical instruments, film equipment, writing tools, and so on. Art making involves a process, a state of mind, and there's always a human behind it who digests everything around them and spits something out. All this imitative AI art is beautiful in its own way but really has no substance; once the wow factor wears off, it won't have much of a leg to stand on, in my opinion. Art making is a self-discovery journey at the same time.

Having said that, I'm curious and somewhat excited to see how these will evolve. As I said, I find them beautiful. As a painter myself there is nothing out there that will make me not paint. Sure, I sometimes use tools but there's always the me in there who is in control or driven by my human instinct.


As someone who was a part of the demoscene in the '90s, I find this offensive.

Generative art can be a wildly creative process on par with anything a painter works through. It's just a different medium.

The way you express yourself through art is not threatened by people choosing other ways using different tools. Painting did not become obsolete because someone invented art photography.

I do agree with the sentiment about NFT 'artists' though. Copy-pasting a Colab notebook, replacing a string, and selling the result as an NFT is just idiotic.

I wonder who the bigger fool is. The one who sells or the one who buys.

Demoscene on Wikipedia: https://en.m.wikipedia.org/wiki/Demoscene

Wired article about the demoscene from 1995: https://www.wired.com/1995/07/democoders/


A lot of what we call "art" now merely means "pleasant picture", with no regard to the artist's intention or any kind of novelty.

I could see AI "art" competing on price with stock pictures and cheap illustrations for throwaway/placeholder use cases, or when uniqueness is preferred.

It's already happening with social media profile pictures, for instance. Next up could be the filler artwork in hotel rooms.


That's good insight. I am not an artist, but I've also run this idea by a friend who is. He loves the idea and is curious what a technological interpretation of his work might look like. He's also interested in how we might use those interpretations to create a new segment of collections for his brand.



Made by humans, and that is key. They digest what came before and spit out something else: a remix, as you call it, but one that does veer in different directions over time.


I don't know any real artists who actually want to get into the NFT space, only con artists.


That's what I am thinking about: what kind of images to train on.


Or seed it with XKCD.

