
Porean

3 followers · follows 1 user · joined 2022 September 04 23:18:26 UTC

No bio...

User ID: 266

Then you've missed the point of the article entirely? It's an election prediction site, trying to put forward a case for a Republican electoral victory. It would be very odd and partisan to portray Redd as an anti-strategist who doesn't care about the outcome and "prefers to die on that hill and lose election".

Where did you learn that from?

I've always wondered if the parentheses attention format was intentionally designed for humour.

Hi, I just want to leave a stub response: you seem right, and I failed to type a proper response after reading this 2 days ago.

You were MetroTrumper? Holy shit.

  1. We aren't important enough. We have about a dozen thousand users who do not much more than words-words-words in a closed community.

  2. We have some pretty good programmers onboard. The codebase is probably not clean right now, but I think that's just a matter of time.

Any independent replications?

Sure.

OCR-VQGAN

Ah, interesting!

My instinct is that this should be smaller and easier than the Stable Diffusion I run on my PC, but maybe I am just super wrong about that?

Super-wrong is correct. Nobody has a consumer-sized solution for that, and if it ever happens it'll be huge news.

This is a great response.

(your first two links are the same)

I agreed with the gist of the article, but I can't help but wonder if this topic was 99% covered by LW at some point.

Actually, just assume I'm wrong. I don't have the links.

But what will the Program be?

Will it be state persecution of racist AI developers to protect disadvantaged minorities? A corporate utopia of AI-driven capitalist monoculture? An anarchist-adjacent future of AI-empowered individuals purging the remnants of the old world?

Or maybe just foom and we all die. That's why I think it's worth discussing!

Text, audio-VQVAE, image-VQVAE (possibly video too) tokens in one stream

How do you suppose it reads tiny words with a VQVAE? Even an RQVAE shouldn't have the pixel precision needed to see tiny 5px font letters.
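To make the intuition concrete, here's a rough back-of-envelope sketch (my own illustrative numbers, assuming typical VQGAN/VQVAE-style spatial downsampling factors of 4/8/16, not any specific model's config):

```python
# Rough illustration: how much of one latent token a tiny glyph occupies,
# assuming a VQVAE/VQGAN-style encoder that downsamples by a fixed spatial factor.
glyph_px = 5  # height of a "tiny" font glyph, in pixels (assumed)

for f in (4, 8, 16):          # assumed spatial compression factors
    patch_px = f              # each latent token summarises an f x f pixel patch
    coverage = glyph_px / patch_px
    print(f"f={f:>2}: one token covers {patch_px}x{patch_px} px; "
          f"a {glyph_px}px glyph spans ~{coverage:.2f} tokens per axis")
```

At a factor of 8 or 16, a whole glyph fits inside a fraction of a single token's patch, so the codebook quantises away exactly the detail you'd need to read it.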

Training consumes far more matmuls than inference. LLM training operates at batch sizes in the millions -- so if you aren't training a new model, you have enough GPUs lying around to serve millions of customers.
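To put rough numbers on that, here's a back-of-envelope sketch using the common ~6*N*D FLOPs-to-train and ~2*N FLOPs-per-generated-token approximations (all figures below are assumptions for illustration, not any particular model's):

```python
# Back-of-envelope: training compute vs. serving compute, using the standard
# ~6*N*D FLOPs-to-train and ~2*N FLOPs-per-token approximations.
N = 70e9                       # model parameters (assumed)
D = 1.4e12                     # training tokens (assumed)
tokens_per_customer = 10_000   # tokens served to one customer (assumed)

train_flops = 6 * N * D
serve_flops_per_customer = 2 * N * tokens_per_customer

customers = train_flops / serve_flops_per_customer
print(f"The compute that trained the model could instead serve ~{customers:,.0f} "
      f"customers at {tokens_per_customer:,} tokens each")
```

With these assumed numbers that works out to hundreds of millions of customer-equivalents from one training run's worth of compute, which is the "GPUs lying around" point.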

or?

I miss the past.

China isn't suffering from stagflation right now like the rest of the world. They have inflation of about 2%; if anything, the worry is that inflation is too low. This is because they didn't print huge amounts of money as stimulus. And the damage to the Chinese economy? According to the Asian Development Bank, Chinese growth will drop to 3.3% this year thanks to Omicron and these lockdowns. US growth is somewhere around 1.5%, and there's a recession looming. The US and the rest of the West are being forced to raise interest rates to reduce the growth that we paid for with stimulus.

I see absolute figures for two groups that did not start with the same absolute numbers.

Resolve?

Delete (yes -- delete) all distractions. Mute everything. Lock your phone in a safe. Ensure that the only kind of rest you're permitted is passing out on the floor from exhaustion.

Okaay I have no idea what's going on with the comment box. The link I have in there right now when I click the edit button is:

https://streamable.com/e/e/ollvts

but it's getting rendered as

https://streamable.com/e/e/e/ollvts

Roughly speaking, I see your point and agree that it's possible we're just climbing a step further up on an infinite ladder of "things to do with computers".

But I disagree that it's the most likely outcome, because:

  1. I think the continued expansion of the domain space for individual programmers can be partially attributed to Moore's Law. More Is Different; a JavaScript equivalent could've easily been developed in the 80s but simply wasn't, because there wasn't enough computational slack at the time for a sandboxed, garbage-collected, asynchronous scripting language to run complex enterprise graphical applications. Without the regular growth in computational power, I expect innovations to slow.

  2. Cognitive limits. Say a full-stack developer gets to finish their work in 10% of the time. Okay, now what? Are they going to spin up a completely different project? Make a fuzzer, a GAN, a SAT solver, all for fun? The future ability of AI tools to spin up entire codebases on demand does not help with the human learning process of figuring out what actually needs to be done. And if someone makes a language model to fix that problem, then domain knowledge becomes irrelevant and everyone (and thus no one) becomes a programmer.

  3. I think, regardless of AI, that the industry is oversaturated and due for mass layoffs. There are currently weak trends pointing in this direction, but I wouldn't blame anyone for continuing to bet on its growth.

If it does then it will be smart enough to self-modify,

This does not work out the way you think it will. A p99-human-tier, parallelised, unaligned coding AI will be able to do the work of any programmer and take down most online infrastructure by virtue of its security expertise, but it won't be sufficient for a Skynet Uprising, because that AI still needs to solve the "getting out of the digital box and building a robot army" part.

If the programming AI were a generalised intelligence, then of course we'd all be fucked immediately. But that's not how this works. What we have are massive language models that are pretty good at tackling any kind of request that involves text generation. Solve for forgetfulness in transformer models and you'll only need one dude to maintain that full-stack app instead of 50.

Why do you care about whose fault it is? You have goals -- accomplish them or don't.

Completely true. Current advances do not guarantee the "no more jobs" dystopia many predict. My excitement probably stems mostly from how much I've involved myself in observing this specific little burst of technological displacement.