This weekly roundup thread is intended for all culture war posts. 'Culture war' is vaguely defined, but it basically means controversial issues that fall along set tribal lines. Arguments over culture war issues generate a lot of heat and little light, and few deeply entrenched people ever change their minds. This thread is for voicing opinions and analyzing the state of the discussion while trying to optimize for light over heat.
Optimistically, we think that engaging with people you disagree with is worth your time, and so is being nice! Pessimistically, there are many dynamics that can lead discussions on Culture War topics to become unproductive. There's a human tendency to divide along tribal lines, praising your ingroup and vilifying your outgroup - and if you think you find it easy to criticize your ingroup, then it may be that your outgroup is not who you think it is. Extremists with opposing positions can feed off each other, highlighting each other's worst points to justify their own angry rhetoric, which becomes in turn a new example of bad behavior for the other side to highlight.
We would like to avoid these negative dynamics. Accordingly, we ask that you do not use this thread for waging the Culture War. Examples of waging the Culture War:
- Shaming.
- Attempting to 'build consensus' or enforce ideological conformity.
- Making sweeping generalizations to vilify a group you dislike.
- Recruiting for a cause.
- Posting links that could be summarized as 'Boo outgroup!' Basically, if your content is 'Can you believe what Those People did this week?' then you should either refrain from posting, or do some very patient work to contextualize and/or steel-man the relevant viewpoint.
In general, you should argue to understand, not to win. This thread is not territory to be claimed by one group or another; indeed, the aim is to have many different viewpoints represented here. Thus, we also ask that you follow some guidelines:
- Speak plainly. Avoid sarcasm and mockery. When disagreeing with someone, state your objections explicitly.
- Be as precise and charitable as you can. Don't paraphrase unflatteringly.
- Don't imply that someone said something they did not say, even if you think it follows from what they said.
- Write like everyone is reading and you want them to be included in the discussion.
On an ad hoc basis, the mods will try to compile a list of the best posts/comments from the previous week, posted in Quality Contribution threads and archived at /r/TheThread. You may nominate a comment for this list by clicking on 'report' at the bottom of the post and typing 'Actually a quality contribution' as the report reason.
There has been a lot of hype news in robotics + AI lately, as the AI updates continue to come at a blinding pace. From Tesla/xAI we have the Optimus robot, and I can't tell whether it's a major breakthrough or just another marketing splash driven by Elon.
On the other side of the fence, you have Nvidia releasing an open foundation model for robotics and partnering with Disney, of all companies, to make a droid robot.
You also have Google's I/O, which I haven't had the energy to look into.
With the speed of AI updates and the wars of hype, it's always hard to tell who is actually advancing the frontier. But it does seem that robotics in particular is advancing quite rapidly compared to even a couple of years ago. Personally, I think that while automating white-collar work is useful and such, AI entering robotics will be the real game changer. If we can begin to massively automate building things like housing, roads, and mass-manufactured goods, all of a sudden we get onto an explosive growth curve.
Of course, this is where AGI doomer fears do become more salient, so that's something to watch out for.
Either way, another day, another AI discourse. What do you think of this current crop of news?
It is crazy to me that most people alive today will be around to see how this - this journey of civilization, this grand process of technological development - ends, or at least moves far, far beyond us. There is a millenarian tension in the air. Paradise or extinction (at least for most people), it seems increasingly clear it will be one or the other.
Yeah, a lot of contradictory thought patterns emerge if I ruminate about the future more than like 2 years out.
Should I live as conservatively, frugally and healthily as possible to ensure I make it to there in good shape, or should I be more reckless and try to enjoy life as much as possible since it could all end? (obviously if EVERYBODY does the latter, we might not make it there at all).
Assuming we survive, are we bound for a future of exploring the stars and colonizing new worlds, or do we get stuffed in VR experience machines that satisfy every psychological desire we have without going anywhere? Will I even have a choice?
Is there any point in breathlessly following every notable development in the AI/Robotics space to try and guess when the big moment will arrive, or would it be more constructive and mentally healthy to divorce almost entirely from it and just read escapist fiction all the time so I don't worry about something I can't really control?
Should I continue to behave as though I expect society to persist into the next century, and thus be very concerned about e.g. birth rates, pollution, governments' fiscal policies, and/or immigration policies? Or will none of this matter in 10-15 years, in which case I should just do the bare minimum to keep things running and let the kids do what they want in the meantime? The AI can fix the mess later.
It is in my nature to prepare, both mentally and financially, for things to go south. I don't buy the hype and promises without skepticism, but I can't deny that every 6 months for the past, what, 3 years, the SOTA models have demonstrated new capabilities that check another box off my "is it smarter than humans?" list. The temptation to just give up 'trying' and go with the flow is strong.
A bit of optimism: I do believe I'm young and healthy enough that I'm likely going to be around when we reach Longevity Escape Velocity. Even if the AGI stuff never fully manifests, we've got all the pieces to fix most age-related problems in humans and give us functional immortality by 2050. Which will create a whole host of new and exciting issues if the AGI isn't already in charge.
I have the same doubts. It’s hard not to care, because for now the problems still exist. It will take solving them to end those concerns.
As for the rest of life, we brought some things forward and are probably living a little faster. There are things I want to do and experience, but most are regular life milestones. Going full hedonist and spending all the money / becoming a drug / sex / gambling / food addict doesn’t seem to make the people who do it happy, end of the world or not.
Add on that there are so many people I care about who are just living life without much awareness of what appears to be on the horizon... and it feels literally impossible to explain to them why they should perhaps care a bit about what we're seeing. There are so many disparate chapters of lore I'd have to catch them up on before they could see the whole picture like I do; I'd look like the crazed conspiracy theorist with red thread until they finally got up to speed and it clicked for them too... if it ever does.
Bit lonely being unable to bridge the gap on a topic that I find important. Hence why I'm here.
There's wisdom in that, but I can think of certain things I could be indulging in that would ABSOLUTELY make me happier, and I would do more of them if my time preference were about 5-10% higher.
One thought that springs forth recently: if I quit my job and sold my house and everything in it, I could afford to buy a decent camper van and then take a year, maybe two, to drive around the country with my dog. And why not? It won't hurt anybody, and I'll rack up a pretty fulfilling experience that will take my mind off the pending event. And that's without touching my own (modest) retirement savings. Which reminds me: what the FUCK am I supposed to do with a 401(k) as someone who is under 40?
Seriously, although I understand the benefit of having a money stash you can't easily touch, the idea that I will want to keep adding to a pile of money I'll be unable to draw from until I'm in my 60s feels farcical under current expectations. Like, I just do not believe the future is one where I diligently tap away at a series of steady jobs, watch my savings grow over a couple of decades, and then draw on that money in old age for a peaceful retirement.
Can someone lay out the path to 2050 where the most likely outcome is that the market grows about 5-7% every year on average; we don't have a debt crisis, a catastrophic event, OR an AI-fueled industrial revolution that pushes things parabolic for a bit; and I, when I hit 67, will be SUPER grateful to my past self for diligently squirreling away U.S. dollars (rather than betting on BTC, for example) over that whole period?
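For what it's worth, the arithmetic behind the standard advice is easy to check. A minimal sketch of what level annual contributions compound to by 2050 at the 5-7% average returns mentioned above (the $10,000/year contribution and 25-year horizon are illustrative assumptions, not advice):

```python
# Future value of a level annual contribution (ordinary annuity),
# compounded once per year. All inputs are illustrative assumptions.

def future_value(annual_contribution: float, rate: float, years: int) -> float:
    """Balance after `years` of contributing at the end of each year."""
    total = 0.0
    for _ in range(years):
        total = total * (1 + rate) + annual_contribution
    return total

years = 2050 - 2025  # 25 years of saving
for rate in (0.05, 0.07):
    print(f"{rate:.0%}: ${future_value(10_000, rate, years):,.0f}")
```

The loop is just the closed-form annuity formula C * ((1 + r)^n - 1) / r unrolled; at these assumed rates the balance lands in the mid-six figures, which is the bet the comment is questioning.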
I will grant, if I cash in all my chips now and the "NOTHING EVER HAPPENS" brigade is right, I'd look very stupid later. And the Gods of the Copybook Headings have been undefeated for centuries.
But even if 'NOTHING EVER HAPPENS,' there are still enough small happenings piling up that it really seems like the standard assumptions behind the ol' "Put aside 15% of your pretax income in an index fund and never touch it" advice are not going to hold in the future. I don't think there's a reason to give up on saving entirely, but it suggests one should be taking wilder risks and worrying much less about historic returns as a guideline for probable future outcomes.
These both sound terrifying to me.
The investment value of BTC is either an underlying "BTC will become so convenient to transact with that everyone will want to keep balances in it" (which looks less likely to happen the longer it goes without happening) or a meta "you can sell your BTC to someone who'll pay even more for it for some reason" (which happens, but can't happen forever without a non-circular reason). The investment value of USD has an underlying "everyone in the US needs some to pay their taxes instead of going to jail", and that's great, but at some point either we're going to get the federal debt under control or we're going to monetize it and dilute your USD to nothing, and I'm not betting on "get the federal debt under control".
This is less terrifying. Sure, if the ASI kills everyone and/or mandates a Socialist Utopia then you're wasting a sweet camping-with-the-dog opportunity, but if property rights retain any respect then it'll be good to have equity in a wide enough array of investments to definitely include some companies who'll manage to surf the tidal wave rather than be crushed by it. A crashing dollar is going to hurt stocks but not as badly as it's going to hurt dollars.
Personally, I just wish I knew what to advise my kids. My index funds are at the "can pay for college if they don't go to med school" level, not the "idle rich" level. Even if AI progress levels off below superhuman, it looks like it will level off at somewhere around "can interpolate within the manifold of all existing human knowledge", and how much economic room is there for the vast majority of human knowledge workers in a world like that? Being able to personally push the boundaries of knowledge into previously uncharted territory used to be what you needed to do to get a PhD, not what you needed to do on a regular basis just to remain economically viable.
The Bitcoin Maxi case at this point is that it is digital gold: more fungible, easier to store, and readily convertible to whatever currency you need. I don't fully buy it myself, but Bitcoin's ongoing survival is proof of something.
And if you think a dollar collapse is pending, then BTC is probably where people flee to, at least in the short term.
I think I can make a case for NOT following certain paths, but as for actionable "Do this to prosper in the future" advice, I am at a loss. It's not like you can just say "Plastics" and nudge them off in the direction of the next big technological gold rush.
10-15 years ago, "learn to code" would have been SOLID advice. No longer. I'm increasingly reading that AI models are really good at various parts of the practice of medicine... and SUPPOSEDLY robot surgery is here. So the med school investment looks a bit questionable.
And as for college funds... why should colleges even be a thing, at least with their current business model, when AIs are generally capable of teaching at the level of even the best professors, across any subject?
Yeah, for Gen Alpha, there is probably NO career advice that previous generations can offer them based on experience other than "wait and see."
I can imagine a world where the AI is doing all the knowledge work but keeps giving humans various tasks it needs completed in order to push the boundaries of knowledge forward. The tasks will seem completely nonsensical to the individual performing them, but in the aggregate they allow the AI to improve things, iteration after iteration, and thereby keep most humans 'employed,' paying them in some currency they can spend with other humans. The 'economy' chugs along, but in effect everyone on earth is a 'gig worker' who gets tasks assigned as needed and gets rewarded for performance.
A scarier version is that the AI requires you to be Neuralinked up to it so it can inject arbitrary commands into your brain as needed, but also rewards you handsomely for helping out.