Hacker News new | past | comments | ask | show | jobs | submit | throwdecro's comments login

Does that $65K represent a trade against USD or a trade against Tether?


On Coinbase Pro (just one example) BTC-USD (real USD, not Tether or any other stablecoin) is currently trading at 66,659.99 USD.


> Does that $65K represent a trade against USD or a trade against Tether?

A fair question to ask, but there are quite a few places where BTC can be traded against actual cold hard cash and not monopoly money (insofar as you consider USD not monopoly money, a shaky proposition in itself).

So yes, USD, not Tether.


USD - you can buy and/or sell on the major platforms for that price.


Being somewhat generous, I don't think it's because they didn't understand the probabilities, it's because they didn't think about the impact of what they were doing at all, beyond the problem they needed to solve.

I.e., the only thing they were considering was whether the number entered at the start of the test would show up in the right places at the other end of their system. They simply didn't consider that an actual phone would be shut off along the way.


Does R2 have fewer expensive footguns than AWS?


R2 isn’t even available yet, so only time will tell


Yes.


I wish they'd offer to remove the touchbar from the laptop I have.


> Your machine has excellent resale value, because it's a Mac.

If the laptop is from Apple's garbage era of 2016-2019, who would pay good money for it? Macs don't unconditionally have excellent resale value; the resale value depends on the quality of the item.


A lot of people will pay very well for those machines. I hear that you think otherwise, but that's because you haven't actually bothered to check. I agree with you that the 2016-19 machines are inferior, though.


Put it on eBay with an auction starting at 99c and see where it ends up, you'll be surprised.


> Which is widely available and effectively free

What? Who is going to fight off people trying to obtain or destroy that USB drive to increase the value of their own holdings (i.e. speed up deflation by reducing available bitcoin), or extract the passcode from its owner, for free?

EDIT: I realize this is the plot of Goldfinger, but the author did invoke Fort Knox, and the strategy of destroying stores of value to increase the worth of your own stash makes sense.


Who does it now?

I don't see how it follows that if bitcoin is useful that somehow militaries or rule of law can't exist.


> Who does it now?

France sent a warship to escort its gold from the US to France. Wasn't free, probably wasn't cheap.


Luckily you don't need a ship to transport bitcoin; you could use a telephone or TLS 1.3.

Edit: or just keep your private key (which is best practice)


Conveniently, USB drives can be backed up.


Sure, but anyone holding that much wealth could expect some kind of "advanced persistent threat" working against them to either destroy all of the backups, or transfer the coins to /dev/null after extracting any passcode from the owner. The keys and anyone that knew how to use them would need a high level of physical security. The "costly militarized facilities" would still be necessary, contrary to the author's statement.


Who actually wants this? I don't want to stare at digital copies of people I'm trying to communicate with, whilst trapped in a digital maze designed by someone trying to keep me engaged. If the "digital twin" is exactly the same, there's no point, and if it's not exactly the same, it's just weird (i.e. "uncanny").

This "metaverse" is some kind of deranged trap. Social media is a hellscape of misunderstanding and conflict already. A further layer of indirection and reduction in "humanness", in the form of "cartoonifying" everyone, is going to further erode society's foundations.


>Who actually wants this?

Folks who played Second Life, Farmville, Animal Crossing, etc. But part of the problem seems to be that some of us have negative emotions toward Facebook, and we feel uncomfortable even just knowing Facebook is working on this.


It's kinda funny; if this is the case, the Japanese anime space has explored the idea of a metaverse extensively.

Although for me "metaverse" mostly evokes the multiverse stuff Marvel does to ensure every superhero becomes dull and boring.


This is a rant I can appreciate. Two subjective takes on the same subject:

1) As JavaScript became ascendant, it seems like developers lost their fear of dependencies, for some reason I don't understand. Dependencies came to be seen as "time saved" rather than "something out of your control that can hurt you."

2) When dependencies don't fit together, things fail in a way that makes it look as if the code is simply wrong. Yet everyone working in the environment ends up conditioned to run to Google with the combination of (circumstances, error string) to find someone who has figured out what the actual dependency issue is. It's automatic behavior. At some point it would make sense to include automated googling in the CI pipeline just to add some necessary information to the errors.
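The automated-lookup idea above could be sketched as a small post-build step. This is just a sketch of the shape of it: the function name, the log contents, and the error-line pattern are all made up for illustration.

```python
import re
import urllib.parse

def search_urls_for_errors(log_text, context="npm install"):
    """Pull error lines out of a build log and turn each one into a
    web-search URL combining the circumstances and the error string."""
    errors = [line.strip() for line in log_text.splitlines()
              if re.search(r"\b(ERR!|Error:)", line)]
    urls = []
    for err in errors:
        query = urllib.parse.quote_plus(f"{context} {err}")
        urls.append(f"https://www.google.com/search?q={query}")
    return urls

# A made-up log fragment: one error line, one ordinary line.
log = "npm ERR! peer dep missing: react@^17.0.0\nresolved 412 packages"
print(search_urls_for_errors(log))
```

A CI job could append these URLs to the failure output, so the (circumstances, error string) pair the reader would have googled anyway is already one click away.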


I agree strongly and think once you regain your fear of dependencies the JS ecosystem actually isn't that bad. The kinds of packages that save me months or years of work are generally of reasonable to great quality, it's packages that (theoretically) save a couple days or weeks that tend to be the real stinkers.


I’ve definitely spent a ton of time on dependency work. I think the web app ecosystem is unique compared to its counterparts because the platform is really not designed for applications; it’s designed to render documents. You get so much more out of the box when developing for proper application platforms like iOS. For example, dealing with dates and internationalization. iOS gives you enough tools to not really need an extra library. Not so with the web.

Plus, the standard library for these platforms is much larger too, because you get access to the huge API surface the platform provides. Apple has spent a significant amount of resources making the platform decent to develop for. They even include reactive UI patterns out of the box now; something you'll need a library for on the web.

I think that’s the real shame. The web platform is very flexible and ubiquitous, so most people try to target it. But in doing so, you’ll need to account for web standards differences across browsers (using a tool like browserslist), the cost of downloading the source code when navigating into a page (with webpack and minification to make bundle sizes more reasonable or split up), and the cost of not having strong default patterns for stateful, interactive UI.

Every other platform will include nearly all of the tools you need within the IDE and on the platform. Web just doesn’t have that, unfortunately, and it has further problems that really don’t exist on other platforms. (For example, other platforms don’t rely on downloading the app source code from a remote server when launching it every time.) It relies on the open source community to provide solutions to so many problems, resulting in the complex dependency trees we see in large apps.

On the one hand, the OSS spirit is admirable. There are competing projects for nearly everything, similar to Linux programs. (Just like there are many Linux dependency managers, there are multiple JS dependency managers.) This allows projects to “get better” by competing with each other, but at the cost of breaking changes and maintenance over time. (For example, there are many projects aiming to provide the fastest JS bundler.) But there is something to be said for having an opinionated platform with more features like iOS or UWP for windows.

All that to say, I don’t think this is a JS shortcoming, but a shortcoming of the web platform in general. And some of those shortcomings are inherent to the benefits of the platform.


A side rant: for some reason in recent years it has been OK to make breaking changes. I remember a time when these were considered a nuclear option -- now popular libraries will make breaking changes for minor aesthetic reasons, resulting in billions of wasted developer hours changing foobar to foo_bar.


Yup, this is what major releases are for, and frankly we need more people like Linus reminding everyone: don't break your external APIs!
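Semantic versioning encodes exactly this contract: breaking changes are only allowed on a major bump. A minimal sketch of npm-style caret-range matching, simplified to illustrate the idea (it ignores pre-release tags and the special 0.x rules):

```python
def parse(version):
    """'1.4.2' -> (1, 4, 2), so tuples compare in version order."""
    return tuple(int(part) for part in version.split("."))

def satisfies_caret(version, range_version):
    """Does `version` satisfy `^range_version`?  A caret range accepts
    anything at or above the stated version with the SAME major number,
    on the theory that only a major bump may break the external API."""
    v, r = parse(version), parse(range_version)
    return v[0] == r[0] and v >= r

print(satisfies_caret("4.9.1", "4.2.0"))  # minor/patch bump: compatible
print(satisfies_caret("5.0.0", "4.2.0"))  # major bump: assumed breaking
```

The whole scheme only works if maintainers honor the contract; a breaking change shipped in a minor release silently defeats every caret range pointing at the package.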


Some projects _never_ make breaking changes, and major version bumps represent major additions.


Very true, and these are the gold standard in some regards. But not everyone has a crystal ball, and many are beholden to external factors they don't control. E.g. the PCIe bus evolving forces Linux to change, and not all change is bad.


> At some point it would make sense to include automated googling in the CI pipeline just to add some necessary information to the errors.

Sounds like a decent feature to include in a compiler!


I installed ESLint into a new project and my package-lock.json file expanded to 3000 lines. I understand what problem it was trying to solve, but the sheer inelegance of it sort of gives everything a jank feel, like you have no idea what's running.

Or perhaps it’s more that it’s exposing a bunch of things that are hidden or tucked in standard libraries in other languages.


Help me understand this. I need a binary called eslint. I put its version x as a dependency. Why can't I just get a fat eslint@x blob of code? Why do I have to care about fetching all of eslint's dependencies? Is this because they don't "bundle" things like eslint? Should they then?


> for some reason I don't understand

Fear of writing original code. There is less perceived risk (actual risk is slightly increased) if you can defer blame.


It's not true that all this mess was caused simply by irrational emotion. There is actual cost involved in writing original code.


How much?

Until there are numbers, as in qualified by data, it's all irrational. It doesn't matter that something costs money. What's important is how much more it costs (the difference).


People don't add more dependencies to their code out of fear; that's just a weird idea.

I'm not saying I know the cost; I'm just rebutting the comment about this being driven by fear. It's not: this is cost saving, which might be lousy at that, or not data driven.

By the way, there is no such thing as a "data-driven software methodology"; no one has ever done such a thing, so asking for it is just rhetorical.


> People don't add more dependencies to their code out of fear; that's just a weird idea.

Weird or not, it is the reason, and it's most commonly expressed as "vetted by the community."

> By the way there is no such thing as "data driven software methodology",

There is. A/B testing is a form of that. Running tests and experiments, even against developers, was my job for a while.

What I find most strange is that this comment expresses the opposite position of your prior comment.


> vetted by the community

I think that's the reason why a certain third-party package is chosen.

The decision between rolling your own and a third-party package often comes down to whether you want to sidetrack engineering _right now_ to write code that's not your product's functionality.

> A/B testing

You're talking about data-driven product decisions. What we are talking about is data-driven engineering decisions. Show me one article on such an experiment.

The first problem is that there is no data. Last I checked, developers are not fond of telemetry in their dev tools.

My point is not that it isn't desirable. It's just not a thing that exists to bring up in arguments.

There is a difference between emotions (fear) and heuristics (this will cost time and is potentially buggy). I mean, they are actually the same in nature, just at different points on the spectrum.


I wish articles like this provided some data as to what "affluent" is. How much money does the 48 year old have, and how much "house", that retirement looks to make sense to them? I'd rather have the dollar figures than the people's names.


Here's what a FAANG software engineer can expect to have at retirement: https://docs.google.com/spreadsheets/d/1Ryu_-mVYxSdJbW8lmf1z...


The idea that the FAANG gravy train can last 30 years is not realistic.

If at least half the letters in FAANG aren't blown away in that time period, it would suggest a tech environment so stagnant that it would be impossible to justify the compensation.

And lasting 30 years at a surviving FAANG company is such a low-probability prospect that it's not something reasonable to include in any future planning.

Very few FAANG engineers can actually expect this outcome.

EDIT: Removed shock word "preposterous", sorry about that.


I almost fully agree with you, but a small part of me wonders if this is different, and we've created a sort of aristocracy that exists outside of normal competition and the rise and fall of companies. In some measure, it's going to depend on modernization of monopoly regulations. But I think there is an outside chance that we're due for 50+ years of stagnation and increasing inequality, after the sort of Cambrian explosion of tech we've seen since the 70s.


Oooh, I like the analogy. Does it mean we're due for a mass extinction event in the tech industry in the future?


Every FAANG is almost 20 or over 20 years old. Certainly engineers that have spent those last 20 years there will probably retire (if they haven't already) with more than the figure in the spreadsheet. Will the particular FAANG companies last another 20 (or 30) years? Who's to say, but I don't see whatever replaces them not being glad to hire former FAANG engineers. So yeah, it's kind of a gravy train that you needn't disembark if you don't want to.


The idea that professional software developers will have their negotiating power depleted is preposterous.

Barring an AGI that can take care of knowledge work, companies will continue to pay a premium to developers because they are oftentimes the core value creators of the business. Even in traditional areas like finance, quants with CS PhDs are displacing Harvard MBA's trading on fundamentals b/c their returns blow the latter out of the water.


There's a difference between "software developers" and "FAANG software developer" though, which is a source of great cognitive dissonance when people see that industry averages are only slightly above $100k/year but FAANG compensation is like 3-4 times that for basic engineering work.

Both are paid a premium, but FAANG is paid a massive premium, and there isn't much precedent for collecting that massive premium continuously for 30 years.

EDIT: Actually in that spreadsheet I would argue that the $250K compensation is too low, the 10% return on investment is too high, and the 30 year timeframe is not sustainable.
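The sensitivity to those assumptions is easy to check with a quick compound-growth calculation. This is only a sketch: the $250K/year savings rate, 10% return, and 30-year horizon are the thread's assumptions, and the 6%/20-year case is an arbitrary more conservative alternative, not data from the spreadsheet.

```python
def future_value(annual_savings, annual_return, years):
    """Value of a stream of year-end contributions, compounded annually."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + annual_return) + annual_savings
    return total

# The thread's assumptions vs. an arbitrary more conservative scenario:
for savings, ret, yrs in [(250_000, 0.10, 30), (250_000, 0.06, 20)]:
    fv = future_value(savings, ret, yrs)
    print(f"${savings:,}/yr at {ret:.0%} for {yrs} yrs -> ${fv:,.0f}")
```

The gap between the two scenarios is several-fold, which is the point being argued: the headline number is dominated by the return and timeframe assumptions, not the savings rate.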


Lotta software developers out there at non-FAANG companies getting paid FAANG or near-FAANG wages. Probably more software engineers making 250k+ outside of those five companies than inside them.

Also I kinda question the assertion that these companies being active and important in thirty years would suggest something untoward. In most other industries, the "Blue Chip" companies are pretty durable. JPM's lineage goes back to 1871, and that fact does not prevent its current employees from being well-compensated.


> The idea that professional software developers will have their negotiating power depleted is preposterous.

Lol, I heard the same arguments in the 1990s about webmasters, which at that time were also commanding large premiums over the market median. I also remember when any engineer who touched a linux kernel could make 3x "normal developer" wages. Most FAANG engineers aren't working on anything too special; the biggest competition will be off the shelf frameworks/libraries/application which can do what previously required custom work.

We are also in a period of easy investment money - the biggest threat to FAANG companies is the market demanding a return on their investment - P/E ratios are at historically unsustainable levels. Either "this time is different", or this will all end very badly for a lot of people, just like the first dot-com boom.


Yep. We haven't hit a real bear market in quite a while -- even 2008 was mostly a road bump if you were in tech. We'll see what shakes out when everyone isn't getting trivial 20-30% gains in the market every year.

There are good years, and then there are bad years...


OK, so what is your estimate of net worth for a software engineer retiring today after 30 years in the industry?


https://mobilemonkey.com/articles/employee-tenure-in-tech-co...

Here's how long employees are staying at the 10 biggest companies in tech:

  Facebook – 2.02 Years
  Google – 1.90 Years
  Oracle – 1.89 Years
  Apple – 1.85 Years
  Amazon – 1.84 Years
  <...>


That is misleading. If a company is high growth, their median tenure is going to be low.


The article mentions leaving, suggesting that it does not include employees currently employed.


It's a misleading article, and the study was misinterpreted across the industry. For example here: "The average number of years at tech disruptors and titans."

It doesn't say the average number of years before employees leave: https://insights.dice.com/2017/08/22/tech-jobs-last-2-years-....


I mean, I can generally negotiate higher than a 20% bump every two years or so by going to a new employer; I'm not sure what my incentive is to stick around longer. My understanding is that's prevalent throughout the industry.
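That incentive is easy to quantify. A rough sketch, where the 20%-every-two-years figure comes from the comment above and the $150K baseline salary is made up for illustration:

```python
def hop_salary(start, bump, hops):
    """Salary after `hops` job changes, each with a fractional `bump` raise."""
    return start * (1 + bump) ** hops

# Made-up $150K baseline, changing jobs every two years for a decade:
salary = hop_salary(150_000, 0.20, 5)
print(f"${salary:,.0f}")  # roughly 2.5x the starting salary
```

Five 20% hops compound to about a 2.5x multiple, which is hard for any single employer's annual merit raises to match.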


These people are stupid, and solutions like "a combination that only they know" or "disciplinary collars" are also stupid. You wouldn't want to rely on entirely novel, oppressive, and essentially untested ways of maintaining your power, when the price of failure is that you're the hated target of everyone around you.

