Anti-Patterns (christianfindlay.com)
70 points by pplonski86 on June 2, 2019 | 32 comments



>To qualify for the term “pattern”, an approach is not worthy until someone or some group has proven through some formal process that this approach is superior to all others, and qualifies as “best practice”.

Since when?

>An approach that earns the title of “pattern” not only becomes one tool in the toolbox, it becomes dogma.

Well, this definition seems a wee bit dogmatic.

>Everything that runs contrary to a given pattern becomes an “anti-pattern”.

This is nonsense.

edit - If I came up with a nicely thought-through set of integrated UI elements that had not gone through any formal process or testing and asked someone what they thought of it as a pattern, I might expect to be asked, "How do you know it is a good pattern?"

However I would not expect anyone to tell me, "You cannot use the term 'pattern' as it hasn't been proven through a formal process that this approach is superior to all others, and therefore qualifies as 'best practice'." If anyone did tell me this, I'd be looking for someone else to get feedback from.


In the original definition of "design pattern", it is a solution to a problem that has been generalised across multiple domains. In fact, a "pattern" is not a pattern unless it has been independently discovered by multiple different groups (I believe the original suggestion was 3). The point of a design pattern is not that it is superior; it is that it is common.

A design pattern is most assuredly not a best practice. In fact, it makes little sense to talk about a design pattern in isolation. It is a solution in a context. In some contexts it is appropriate and in others it is not. Design patterns used to have large preambles discussing the contexts where they were appropriate and where they were not.

The point about an anti-pattern is that it is surprising. It must be a common approach that looks like it should work, but doesn't. It is definitely not a "bad practice", or an "I don't like that" practice, or even an "inappropriately applied pattern". It's a surprising result that you should be aware of, because it tends to catch people frequently.

Here is the first reference I know about discussing design patterns in software: https://c2.com/doc/oopsla87.html (Yes, the "87" refers to "1987"). People who have only read about design patterns in the last decade or two are almost certainly missing the original point (as seems to be sadly common with these kinds of things).


To me, antipatterns are common modes of failure.

Take for example a load balancer that is configured to send traffic to whichever server is fastest at the time. Makes sense: if a server is fast, it has more available capacity to handle requests. But when the front-end code fails, the server short-circuits, skipping all the expensive code as it unwinds the stack and returning a 5XX error, and the load balancer that just sends to the fastest now sends ALL traffic to the failing node.

The antipattern is a feedback loop that is not measuring the right thing. In this case, they should be timing _successful_ calls to the front end.
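
A minimal C++ sketch of that fix (the Backend/recordResponse/pickBackend names are illustrative, not from the comment): only successful responses feed the latency score, so a node that fails fast never looks like the fastest backend.

    #include <limits>
    #include <string>
    #include <vector>

    // Track latency only for successful responses, so a backend that fails
    // fast (cheap 5XX) cannot look like the "fastest" one.
    struct Backend {
        std::string address;
        double avgSuccessLatencyMs = 0.0;  // smoothed over successful calls only
        long long successCount = 0;
    };

    void recordResponse(Backend& b, int httpStatus, double latencyMs) {
        if (httpStatus >= 500) return;     // a failure must not improve the score
        ++b.successCount;
        const double alpha = 0.2;          // smoothing factor (illustrative)
        b.avgSuccessLatencyMs = (b.successCount == 1)
            ? latencyMs
            : alpha * latencyMs + (1.0 - alpha) * b.avgSuccessLatencyMs;
    }

    Backend* pickBackend(std::vector<Backend>& pool) {
        Backend* best = nullptr;
        double bestLatency = std::numeric_limits<double>::max();
        for (auto& b : pool) {
            if (b.successCount == 0) return &b;  // untested node: give it traffic
            if (b.avgSuccessLatencyMs < bestLatency) {
                bestLatency = b.avgSuccessLatencyMs;
                best = &b;
            }
        }
        return best;
    }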


^ came here to post pretty much this; the author seems to be trying to make a fairly good point, but the way it comes out is a bit strange and lacks foundation. It almost reads like the author had a bad experience, online or in real life, where he was told he used an anti-pattern when he actually wasn't, and is now extrapolating from that. Don't know, just guessing here.

Sure there might be a group of not too well-informed/skilled people religiously believing patterns must be used whenever even remotely possible and that not using patterns is bad, mkay, but I think (hope?) the group considering them as just another tool in the toolbox, used when appropriate but not otherwise, is larger.


>Sure there might be a group of...

Several. I've not worked in such an environment myself, but I've had colleagues who have: workplaces where every design is permitted only to use concepts from the big book of patterns, and any situation to which they are inappropriate is nonetheless solved (badly) with that big book of patterns.


>workplaces where every design is permitted only to use concepts from the big book of patterns

Can you name any of these workplaces?


The last chap who comes to mind, who I worked with and who had previously worked in such a company, was named Richard. Can't remember his last name; do remember I gave him a Wallace and Gromit moving toastrack as a leaving present when he went on to work for a flight sim company in the south west of England somewhere.

I last worked with him when I was working for a UK offshoot of a US Defence company. This would have been somewhere around... six years ago, I guess.

I suppose I could try to dig out his contact details and ask him, but I won't. So the short answer is no, I can't name that company. Got a feeling it was something involving printing, or imaging.

I also cannot name any of the companies that any of my current colleagues have previously worked at. Is everyone else keeping track of where all their colleagues and ex-colleagues have worked? Wouldn't be surprised. I don't keep track of any of that. I'm not even on the facebook.


>So the short answer is no, I can't name that company. Got a feeling it was something involving printing, or imaging.

Shame, I was hoping to avoid them. I do know where quite a few people I have worked with have worked previously, purely because this kind of thing can come up in discussion and my memory sometimes functions.

>I'm not even on the facebook.

Congratulations. I've managed to never be on it either.


Thanks for making me appreciate my imperfect job more :)


>This is nonsense.

Is that not the very point being made?


Well, yes, but is there such nonsense to begin with? I.e., where does the original statement come from? Where is the canonical source claiming that whatever is not a pattern is an anti-pattern?


I have certainly worked in places where, for example, code consistency is rated so highly that writing bad code consistent with the existing codebase is considered preferable to writing good code. Quite literally pattern and anti-pattern.


Are you talking about formatting conventions or something else? Because if all you’re talking about is coding style, then yeah, it’s infinitely better to comply with the conventions of whatever you’re working on than to do your own thing. For something as subjective as coding style, consistent vs inconsistent is way more important than which particular style is used. That almost goes without saying.


No. I'm talking about things like endlessly copying horrifically monstrous multiple inheritance patterns, and painful single-threaded state machines instead of using threads to spin off work, and using shonky home-grown reference counting (that's thread-dangerous) instead of standard library smart pointers, and writing your own polymorphism instead of just using the inheritance that the language comes with, and leaving 80% of CPUs idle while trying to jam all the work into a single thread, and rewriting pieces of industry-standard libraries which then makes upgrading impossible, and writing horrifically heavy wrappers around simple DB access that turns modern DB interaction into sludge for the purpose of solving problems that the previous product had and the current product doesn't.

Take all that, and then insist that all new code do that as well. That's what I'm talking about.
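
To make just one item in that list concrete, here is a minimal C++ sketch (the class names are hypothetical) contrasting a hand-rolled, non-atomic reference count of the thread-dangerous kind described with std::shared_ptr, whose use count is updated atomically:

    #include <memory>

    // Home-grown reference counting of the kind described above: the count is
    // a plain int, so concurrent addRef/release calls can race and double-delete.
    struct LegacyCounted {
        int refs = 1;                        // not atomic: thread-dangerous
        void addRef() { ++refs; }
        void release() { if (--refs == 0) delete this; }
    };

    struct Widget { /* ... */ };

    int main() {
        // Standard-library alternative: the shared_ptr control block keeps an
        // atomic count, and destruction happens exactly once.
        std::shared_ptr<Widget> w = std::make_shared<Widget>();
        std::shared_ptr<Widget> alias = w;   // count bumped atomically
        return 0;
    }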


The term pattern is not dependent on things being proven through formal processes, and anti-patterns are not defined as things that are not patterns.

Anti-patterns can be patterns that have been proven through formal processes, and the most successful of them tend to be.


Multiple misunderstandings here. Most importantly the author seems to have misunderstood the quote from Wikipedia defining an anti-pattern. The definition said an anti-pattern is a commonly seen solution which is net detrimental AND for which a known net-positive solution exists.

But it’s important to note that anti-patterns are still patterns, i.e., they are commonly seen solutions. They just happen to be poor ones. But your innovative solution, whether it’s good or bad, can’t be an anti-pattern, any more than it can be a pattern. Because it’s new. No one else is using it. It’s not a pattern yet.

Likewise the author seems to have missed the AND part, and spends a good deal of time ranting that just because a solution exists doesn’t mean any other solution is an anti-pattern... well, yes, exactly!


The author provides no examples of "Anti-patterns" that should or should not be called anti-patterns. He just claims the term is derogatory and critical of people's opinions.

Quickly labeling some solution as an anti-pattern is useful because it's efficient. We repeat the same mistakes, and it's usually not worth the effort to make a detailed analysis of every single mistake developers make just so their feelings don't get hurt by "derogatory" anti-pattern labeling.


> He just claims it's derogatory and is critical to people's opinions.

Silly us, right? Here we were, all this time, thinking that "anti-pattern" is a good-job-pat-on-the-back from old dad!

Derogatory and critical, humph! Who woulda thunk, ya know?


the whole 'pattern' and 'anti-pattern' terminology robs technical discussion of its meaning.

can't we just say things like 'maybe writing a bespoke build system isn't the best use of time right now', or 'there is this really cool range data structure I saw that seems really relevant, here's a link', or 'we keep having consistency problems cleaning up objects to put them back in the pool, maybe we should just always reinitialize them so that all that code is in one place'

trying to boil down all software to 50 simple shapes that we can assemble is just way too reductionist. think about what you're doing. use your words.


Patterns aren’t reductionist, they’re just names for things that are common so that it’s easier to talk/think/write about them when those are the things being discussed. Surely no one has ever told you that all code you write has to conform to some set of predefined design patterns. That would be an engineering antipattern right there. Like most things in design, they’re supposed to help you think more clearly and concisely about common problems, they are absolutely 100% not intended to circumscribe your solutions.


It is funny how some programmers, especially from the Lisp/functional programming world, equate the term "design patterns" with the GoF book and its list of 23 specific patterns, and then proudly declare that "Lisp does not have/need design patterns". Of course it has, just different ones than Java/C#.


> The term “anti-pattern” is a derogatory term used to disparage software design approaches that a given developer, or group of developers may not like.

So right off the bat you have an author doing the exact same thing he accuses others of doing: abusing the meaning of a term as a bludgeon against pet peeves. And yet here he is, in the very first sentence of the article, lying about what an anti-pattern is because it makes his pet peeve look more legitimate. And it doesn't get that much better in the rest of the article.

The complete lack of self-awareness of some people is absolutely astonishing.


The exact logical misstep made in this article is the assumption that saying a single solution is wrong (in this case by calling it an antipattern) can only be done because someone believes a single solution is right (by calling it a pattern).

This is completely fallacious, as there may be many commonly executed good and bad solutions. We would just like to differentiate between which ones nearly always lead to dramatic costs later, and which ones nearly always lead to low costs later.


Software Design Patterns are an Anti-Pattern.

They could be used as terminology for commonly recurring patterns. But ask your colleagues about the difference between a facade, an adapter, and a decorator and you will see that in reality they are useless for that. The chosen names are just too generic to have a precise, memorable meaning.
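
A minimal C++ sketch of why the names blur (Logger, VendorSink and the rest are made up for illustration): structurally an adapter and a decorator are both thin wrappers around another object, and the only real difference is intent: an adapter converts the interface, a decorator keeps it and adds behaviour.

    #include <string>

    // Target interface the rest of the code expects.
    struct Logger {
        virtual ~Logger() = default;
        virtual void log(const std::string& msg) = 0;
    };

    // Third-party class with a different interface.
    struct VendorSink {
        void write_line(const char* text) { (void)text; /* ... */ }
    };

    // Adapter: converts VendorSink's interface into Logger's.
    struct VendorLoggerAdapter : Logger {
        VendorSink& sink;
        explicit VendorLoggerAdapter(VendorSink& s) : sink(s) {}
        void log(const std::string& msg) override { sink.write_line(msg.c_str()); }
    };

    // Decorator: keeps the Logger interface, adds behaviour around it.
    struct TimestampLogger : Logger {
        Logger& inner;
        explicit TimestampLogger(Logger& l) : inner(l) {}
        void log(const std::string& msg) override { inner.log("[ts] " + msg); }
    };

The two wrappers are almost line-for-line identical, which is arguably why colleagues struggle to keep the names apart.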

People think they can use them as abstractions, but most patterns don’t abstract in a way that programming languages support. The only thing I can do with an adapter is add it to the class name and feel intelligent. People tend to do that less with real abstractions (List instead of SequentialIteratorFactory or IteratableMonad).

The most offensive thing people are doing with design patterns is trying to use them like bricks to build code. You will end up with code that’s 90% boilerplate and 10% actual logic strewn all over the place.


> The most offensive thing people are doing with design patterns is trying to use them like bricks to build code. You will end up with code that’s 90% boilerplate and 10% actual logic strewn all over the place.

This is absolutely one of my greatest pet peeves with the developers I have been interacting with over the past several years. It is almost as though they ignore all functional and non-functional requirements set forth before them, instead placing a priority on having no fewer than four `AbstractSingletonFactoryAdaptor`s. The result is the complete abuse, and misuse, of these patterns, in code that is just impossible to follow and almost always incorrect (Really? A singleton? Do you REALLY think that there will only ever be one logged in user during the life of the JVM? Or did that just HAPPEN to be the case while you were sitting at your desk pounding this out?).


“...Design patterns are formalized best practices...”

(From Wikipedia, apparently, not the author of the article.)

No idea where that idea came from, but this probably just means Wikipedia should be updated. Design patterns don’t need to be “best practices”. Also, that term is mostly useless in the definition or description of something, since it’s highly contextualized.

It looks like “Formalized” should also be dropped. It should say something like “defined and documented” instead. As it is, I think it can be confused (as the author of the article has done) with referring to having gone through a formal verification process.


> Declaring one of these as correct and all others to be incorrect is an oversimplification, and a logical fallacy.

I think this black-and-white thinking (that there is a right answer to “what pattern should I use?”) & the social disparaging of making a mistake/being wrong both stem from logocentrism in Western critical thought.

The attitude of “being right trumps any objections” is reinforced in CS grads trained by examinations where there are correct answers, and doesn’t set them up for the wide world, where clients don’t want the technically correct solution when it comes at the cost of delivery speed, brittleness, inflexibility, etc. This is one place where Software Engineering is superior, since it trains students in various techniques and the associated trade-offs. Imagine civil engineers & architects insisting that, since A Pattern Language says four square walls and a peaked roof exemplify the timeless way of building, that is the only way to construct a house.

The real value-add of patterns is giving a standard vocabulary/shorthand for discussing designs without getting bogged down in the boring technical details of the implementation. Uncritically cleaving to them as some sort of defence of quality is pure cargo culting.


IMO the author misses the aspect of scope.

The term "Anti-Pattern" is not used in a vacuum. Instead, it usually refers to a specific domain, like >Usability< or >Static Websites<, for example. If I refer to the usage of fuzzy, thin fonts on mobile as an anti-pattern, it does not mean every artistic collage with this feature is automatically shit. If I live in 1820 and tell people not to use metal for horse-riding gear, I'm not judging the possibility of a superior means of transportation made out of metal.

But for the 99% of people who just want to build something with a technology, not revolutionize it, the word is still a good metaphor to guide them to the right path in the given time and place.

Of course, as with most things, defining anti-patterns shouldn't escalate into dogmatism, but I don't follow the conclusion that the metaphor itself is destructive or bad...

edit: grammar


Anti patterns imho:

Singleton

Inheritance as a means of code sharing (vs polymorphism)

Premature optimization

Manual resource management (vs RAII; see the sketch after this list)

Implicit ignoring of errors
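
A minimal C++ sketch of the last two items (the File class is illustrative): the destructor releases the resource on every exit path, and a failed acquisition is surfaced rather than silently ignored.

    #include <cstdio>
    #include <stdexcept>
    #include <string>

    // RAII: the destructor closes the file on every exit path, instead of
    // relying on a manually placed fclose() that an early return can skip.
    class File {
        std::FILE* f;
    public:
        explicit File(const std::string& path)
            : f(std::fopen(path.c_str(), "r")) {
            if (!f) throw std::runtime_error("cannot open " + path);  // don't ignore the error
        }
        ~File() { if (f) std::fclose(f); }
        File(const File&) = delete;
        File& operator=(const File&) = delete;
        std::FILE* get() const { return f; }
    };

    void readConfig() {
        File cfg("config.txt");   // closed automatically, even if parsing throws
        // ... parse cfg.get() ...
    }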


This article is an anti-pattern.


This article is incoherent.


This article, like most, rides on the idea that Patterns are Good. But each pattern really represents a failure in our languages and language ecosystems. An anti-pattern is just a bigger failure.

We have a different name for the true successes: Libraries. When we can encapsulate a solution to a recurring problem, and make a clean interface to it that does not dictate the architecture of the whole system that uses it, we put it in a library, document it, and maybe publish or even standardize it.

The merit of different languages for constructing systems (as opposed to myriad other uses for languages!) turns on how effectively you can build, deploy, and use libraries in those languages. Each place where no library can be constructed or used, requiring deployment of a pattern instead, represents a failure of the language design or ecosystem to enable encapsulating and generalizing the idea. In a Better Language, we would use a core language feature or library, and not need to code the pattern over, yet again.

When we propose extending a language with a new feature, our best arguments for the feature arise from examples of libraries we could write, that we cannot, now. Or, more frequently, are of how it would enable better encapsulation and generality for the kinds of library we already use. (Previously impossible libraries are usually conceived later.)

This is the reason that C++ still dominates in system implementation, even though its most devoted users are its worst critics. It originated the goal of the zero-cost abstraction, has carried it farther, and is still moving.

As an example, the hash table is a common Pattern in C. C coders are always writing custom hash tables and hash schemes, because you too often can't get tolerable performance from a library version. In slow languages, the performance overhead is considered negligible, and the built-in hash table dominates. Meanwhile, in C++, it would be foolish to code a single-use hash table. The std-library one is fast and forgiving; and better ones, deeply analyzed and optimized with detailed tradeoffs (entry stability, modify vs lookup speed, etc.) and thoroughly tested, are easily found and dropped in.
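
A rough C++ illustration of dropping in the library version instead of re-coding the hash-table pattern (the word-count example is mine, not the commenter's):

    #include <iostream>
    #include <string>
    #include <unordered_map>

    // The hash table comes from the standard library; swapping in a third-party
    // map with different trade-offs usually only means changing this alias.
    using Counts = std::unordered_map<std::string, int>;

    int main() {
        Counts wordCount;
        for (auto w : {"foo", "bar", "foo"}) {
            ++wordCount[w];                     // count default-initialised to 0 on first sight
        }
        std::cout << wordCount["foo"] << "\n";  // prints 2
        return 0;
    }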

Sometimes a library can't displace a pattern, and you need a core language feature. C++ has templates. Rust has its borrow checker. C++ and Rust are getting "await". In languages with the right bent, these core features combine and breed to provide encapsulation for ever more powerful libraries.
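
A small sketch of that kind of encapsulation, assuming C++17 (the ScopeGuard name is illustrative): templates plus destructors let the "run this cleanup on every exit path" pattern be written once as a tiny library component rather than re-coded at every call site.

    #include <cstdio>
    #include <utility>

    // Generic scope guard: the cleanup-on-exit pattern captured as a library.
    template <typename F>
    class ScopeGuard {
        F cleanup_;
    public:
        explicit ScopeGuard(F f) : cleanup_(std::move(f)) {}
        ~ScopeGuard() { cleanup_(); }
        ScopeGuard(const ScopeGuard&) = delete;
        ScopeGuard& operator=(const ScopeGuard&) = delete;
    };

    int main() {
        std::puts("acquire");
        ScopeGuard release([] { std::puts("release"); });  // C++17 class template argument deduction
        std::puts("work that might return early or throw");
        return 0;                                          // "release" still runs here
    }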

Powerful libraries are not just a convenience, saving coding time. They add correctness, at scales large and small, not available from patterns. They amortize and capture optimization effort far beyond what would be affordable for single-use code.

Obligate garbage-collection is poison to a powerful-library ecosystem, because the effects of tradeoffs needed to meet performance requirements reach deep into the shared semantics libraries rely on. A powerful library for an obligate-GC language would need to accommodate all the different ways GC might work. In practice, nobody has time for that, so library dependencies in such languages instead constrain the tradeoffs, and many libraries are likely not to be usable in a system that must make them. With fewer uses to amortize over, libraries get less attention, and are fewer.

C++ shows its age, but nothing to replace as a powerful system language is above the horizon. Rust might get there, eventually, if its minders raised their sights, and if development of C++ were to slacken.

Most coding, meanwhile, is done on smaller systems, in more forgiving environments, where abstraction overhead is tolerated. There is little pressure to enable replacing patterns with libraries. Patterns are still in the vocabulary there.



