Friday Fun Thread for July 4, 2025

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

Tooting my own horn. December 1, 2022 I predicted:

My honest bet is that any student currently in their first year of law school will be unable to compete with AI legal services by the time they graduate. Certainly not on cost. The AI didn't incur 5-6 figure loans for its legal training.

Put another way, the AI will be as competent/capable as a first-year associate at a law firm inside 3 years.

This was before GPT-4 was on the scene. I reiterated it 3 months ago.

And then today I read this nice little headline:

Artificial Intelligence is now an A+ law student, study finds

If they can stop the damn thing from hallucinating caselaw and statutes, it might already be there.

But let me admit that if we don't see downward pressure on first-year wages or staffing reductions this year, I missed the meatiest part of the prediction.

There's the counter-argument that AI lawyers will actually stimulate demand for attorneys by making contracts way more complex. I don't buy it, but I see it.

Paralleling @JarJarJedi's story, I tried to outsource some annoying research to Gemini and it just gave me the runaround. [TW: Boring ops work] I was trying to find a way to copy a large-ish Cloud SQL MySQL DB from one GCP project to another. The solutions I was given:

  1. Just make a clone in another project (not allowed)
  2. Dump the data to a GCS bucket (I said I didn't want to do that in the original prompt)
  3. Use DMS to transfer the data to an empty Cloud SQL instance in another project (sounded promising, was my original thought... but you can't select a DMS instance in another project from the drop-down)
  4. Just use the public IP address of the other Cloud SQL instance as the target (and send my prod DB data over the internet? pretty sure the security team would kill me)
  5. Just use the private IP address of the other Cloud SQL instance (...what? it's in another VPC in another project)
  6. Just peer the VPCs (hmmm.... but no, Cloud SQL instance interfaces actually exist in a "private services subnet" which is part of a hidden Google-controlled VPC which gets automatically peered to the customer's VPC in the same project (A <-> B), and GCP does not support transitive VPC peering (A <-> B <-> C))
  7. Just peer the hidden VPCs in each project directly (they're Google-controlled so we can't create peerings)
  8. Peer the customer's VPC in the source project with the Cloud SQL VPC in the target project (that doesn't make sense)
  9. Just dump the database into a bucket (sigh)

In the end I probably could have just spent 30-45 minutes reading the docs and figured out what my real options were, rather than spending several hours trying half-baked solutions. Now I just use Gemini for short scripts, text editing operations, and boilerplate.

[PS: I think we can use PSC (Private Service Connect) to solve the problem above.]
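
From memory, the flow would look roughly like the sketch below. It's untested, the flag names are whatever I recall from the Cloud SQL PSC docs, and every project/instance name is made up, so treat it as a starting point rather than a recipe.

```bash
# Untested sketch; flags recalled from the Cloud SQL PSC docs, all names invented.
# IIRC PSC has to be enabled at instance creation, so the idea here is: make the
# *new* instance in the target project the PSC side, allow-list the source
# project, and push the data to it from the source side.

# 1. Create the PSC-enabled target instance, allow-listing the source project.
gcloud sql instances create copy-target \
  --project=target-project --region=us-central1 \
  --database-version=MYSQL_8_0 --tier=db-custom-4-16384 \
  --enable-private-service-connect \
  --allowed-psc-projects=source-project

# 2. Grab the service attachment it exposes.
ATTACHMENT=$(gcloud sql instances describe copy-target \
  --project=target-project \
  --format='value(pscServiceAttachmentLink)')

# 3. In the source project's VPC, reserve an internal IP and point a PSC
#    forwarding rule (the endpoint) at that attachment.
gcloud compute addresses create psc-sql-ip \
  --project=source-project --region=us-central1 \
  --subnet=default --addresses=10.10.0.5

gcloud compute forwarding-rules create psc-sql-endpoint \
  --project=source-project --region=us-central1 \
  --network=default --address=psc-sql-ip \
  --target-service-attachment="$ATTACHMENT"

# 10.10.0.5:3306 inside the source VPC should now reach the new instance --
# no public IPs, no VPC peering.
```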

Why not set up a machine instance in one of the VPCs, ssh into it, run mysqldump against the source DB and pipe it to mysql connecting to the destination DB? The piping avoids storing it on a disk/bucket.

Why would the security team kill you for connecting to public IPs? Just make sure TLS is enabled and you're good to go. Also, if both VPCs are in the same AZ/DC the traffic likely isn't going over the internet at all; even between Google Cloud DCs it may never leave Google's network.

Might need to ask an LLM for the right set of options to disable buffering and select the databases you want to clone.
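
Something like the sketch below is what I have in mind. Hosts, credentials, and database names are all placeholders, and I'd dry-run it against a non-prod copy before pointing it at anything that matters.

```bash
# Placeholder hosts/credentials/schemas throughout. Run from a small VM that can
# reach both Cloud SQL instances.
#
#   --single-transaction   consistent InnoDB snapshot without locking everything
#   --quick                stream rows instead of buffering whole tables in memory
#   --set-gtid-purged=OFF  Cloud SQL targets tend to reject GTID statements in dumps
#   --ssl-mode=REQUIRED    keep TLS on if the hop uses a public IP

SRC_HOST=10.1.2.3
DST_HOST=203.0.113.10

mysqldump --host="$SRC_HOST" --user=copier --password='src-secret' \
  --single-transaction --quick --set-gtid-purged=OFF \
  --databases app_db billing_db \
| mysql --host="$DST_HOST" --user=copier --password='dst-secret' --ssl-mode=REQUIRED
```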

Re. mysqldump, that's what we did last time we had to do this, but I was hoping there was a less manual way. You can automate anything with enough scripts and DevOps duct tape, but I try to take zero-maintenance options whenever I can because I have to scale my meager team and hiring is rough right now (because our budget is shit).

Re. the security team, the tech details matter less than the perception. They're more of a compliance team than a security team. Such is life outside of Silicon Valley, sadly.

Ha, I'm more the opposite. The shell script is legible and easier to reason about. Some GCP widget is going to break for some mysterious reason, and getting help is impossible if the docs don't cover your use case.

In an ideal world where I have enough guys, I far prefer the script or custom tooling because, as you point out, it's whitebox and easy to fix and customize. But that's not usually the case, plus as a manager I have to play departmental politics. It's much better to be able to blame GCP than to have fingers pointed at scripts that we wrote (sometimes unjustly!) when we have a huge backlog and no time to fix them. I'm looking forward to changing companies, if you can't tell.