Friday Fun Thread for July 4, 2025

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that); this thread is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

Tooting my own horn. On December 1, 2022, I predicted:

My honest bet is that any student currently in their first year of Law School will be unable to compete with AI legal services by the time they graduate. Certainly not on cost. The AI didn't incur 5-6 figure loans for its legal training.

Put another way, the AI will be as competent/capable as a first-year associate at a law firm inside 3 years.

This was before GPT-4 was on the scene. I reiterated it 3 months ago.

And then today I read this nice little headline:

Artificial Intelligence is now an A+ law student, study finds

If they can stop the damn thing from hallucinating caselaw and statutes, it might already be there.

But let me admit that if we don't see downward pressure on first-year wages or staffing reductions this year, I missed the meatiest part of the prediction.

There's the counter-argument that AI lawyers will actually stimulate demand for attorneys by making contracts way more complex. I don't buy it, but I see it.

Paralleling @JarJarJedi's story, I tried to outsource some annoying research to Gemini and it just gave me the runaround. [TW: Boring ops work] I was trying to find a way to copy a large-ish Cloud SQL MySQL DB from one GCP project to another. The solutions it gave me:

  1. Just make a clone in another project (not allowed)
  2. Dump the data to a GCS bucket (I said I didn't want to do that in the original prompt)
  3. Use DMS to transfer the data to an empty Cloud SQL instance in another project (sounded promising, was my original thought... but you can't select a DMS instance in another project from the drop-down)
  4. Just use the public IP address of the other Cloud SQL instance as the target (and send my prod DB data over the internet? pretty sure the security team would kill me)
  5. Just use the private IP address of the other Cloud SQL instance (...what? it's in another VPC in another project)
  6. Just peer the VPCs (hmmm.... but no, Cloud SQL instance interfaces actually exist in a "private services subnet" which is part of a hidden Google-controlled VPC which gets automatically peered to the customer's VPC in the same project (A <-> B), and GCP does not support transitive VPC peering (A <-> B <-> C); see the sketch after this list)
  7. Just peer the hidden VPCs in each project directly (they're Google-controlled so we can't create peerings)
  8. Peer the customer's VPC in the source project with the Cloud SQL VPC in the target project (that doesn't make sense)
  9. Just dump the database into a bucket (sigh)
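
For what it's worth, the peering situation in item 6 is easy to confirm: the Google-managed service-networking peering shows up when you list the VPC's peerings. A minimal sketch, assuming google-api-python-client and placeholder project/network names:

```python
# Sketch: list VPC peerings to see the Google-managed service-networking
# peering that Cloud SQL private IP relies on. Project and network names
# below are placeholders.
from googleapiclient import discovery

compute = discovery.build("compute", "v1")
net = compute.networks().get(project="project-a", network="default").execute()

for peering in net.get("peerings", []):
    # The Cloud SQL / private services peering typically shows up as
    # "servicenetworking-googleapis-com"; it is not transitive, which is
    # why a VPC in another project can't reach the instance through it.
    print(peering["name"], "->", peering["network"], f"({peering.get('state')})")
```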

In the end I probably could have just spent 30-45 minutes reading the docs and figured out what my real options were rather than spending several hours trying half-baked solutions. I just use Gemini for short scripts, text editing operations, and boilerplate now.

[PS: I think we can use PSC (Private Service Connect) to solve the problem above.]
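
If anyone's curious, the PSC route would look roughly like this: check whether the source instance exposes a PSC service attachment, then point a PSC endpoint (a forwarding rule) at it from the destination project's VPC. A sketch only, assuming google-api-python-client; the pscConfig/pscServiceAttachmentLink field names are from memory, so verify them against the Cloud SQL Admin API reference:

```python
# Sketch: inspect the source Cloud SQL instance for PSC support. Project and
# instance names are placeholders, and the PSC-related field names are
# best-effort from memory (verify against the Cloud SQL Admin API docs).
from googleapiclient import discovery

SOURCE_PROJECT = "project-a"      # hypothetical
SOURCE_INSTANCE = "prod-mysql"    # hypothetical

sqladmin = discovery.build("sqladmin", "v1beta4")
inst = sqladmin.instances().get(
    project=SOURCE_PROJECT, instance=SOURCE_INSTANCE
).execute()

psc = inst.get("settings", {}).get("ipConfiguration", {}).get("pscConfig", {})
attachment = inst.get("pscServiceAttachmentLink")

if psc.get("pscEnabled") and attachment:
    # A PSC endpoint in the destination project is just a forwarding rule in
    # its own VPC whose target is this service attachment; the destination
    # project must also be listed in pscConfig.allowedConsumerProjects.
    print("Create a PSC endpoint targeting:", attachment)
else:
    print("Source instance is not PSC-enabled yet.")
```

The nice part is that PSC keeps everything on private addressing without any VPC peering between the two projects.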

I copied your post into ChatGPT (o4-mini), without the last two paragraphs and the PSC idea, and it spat this out after a minute:

—————————

Here’s a proven, zero-downtime-capable way to copy your Cloud SQL for MySQL database from Project A into Project B without dumping to GCS or exposing it publicly: use Database Migration Service (DMS) in the destination project, treating the source instance as a “self-managed” MySQL endpoint. This leverages private connectivity (VPC peering) so data never traverses the public Internet.

  1. Prepare your source Cloud SQL instance
     • Enable binary logging & GTID, and use row-based replication on your source: set log_bin, binlog_format=ROW, gtid_mode=ON, and retain logs (expire_logs_days or binlog_expire_logs_seconds) for at least a week.
     • Create a replication user with REPLICATION SLAVE, REPLICATION CLIENT, SELECT, SHOW VIEW, EXECUTE, TRIGGER, etc., per your migration type.

  2. Enable DMS in the destination project
     • In Project B, enable the Database Migration Service API and grant yourself the Database Migration Admin role.
     • Ensure the DMS service account has compute.networkAdmin so it can set up peering.

  3. Set up private connectivity (VPC peering)
     • In Project B, create a Private Connectivity Configuration under DMS → Private connectivity. Select the VPC from Project A that holds your source instance’s private IP; DMS will automatically peer its service-network VPC into that network.

  4. Create your Source Connection Profile
     • Still in Project B’s DMS UI, go to Connection Profiles → Create Profile → Source.
     • Engine: MySQL
     • Classification: “Self-managed MySQL” (even though it’s Cloud SQL)—this bypasses the same-project restriction.
     • Host: the private IP (or public IP + authorized network) of your Cloud SQL source.
     • Credentials: use the replication user you created.

  5. Create & run the Migration Job
     • Go to Migration Jobs → Create Migration Job in Project B.

    1. Name your job.
    2. Source: select the connection profile from step 4.
    3. Destination: choose an existing instance or let DMS create a new Cloud SQL instance in Project B.
    4. Connectivity: pick Private IP and select the private connectivity config you made in step 3.
    5. Type: choose One-time (initial load only) or Continuous (CDC + initial load).
    6. Test, then Start the job. 

DMS will do an initial data copy over the peered VPC, then (if you chose Continuous) apply changes in near real-time.

  6. Cut over to the new instance
     • Once the initial load is complete and lag is minimal, stop writes on the source, then in the DMS job click Promote. This switches your application to the target instance in Project B.

Why this works
• No GCS buckets or manual dumps needed.
• Data never leaves Google’s private network—no public IP exposure.
• Cross-project works seamlessly by treating the source as self-managed and running DMS in the target project.
• Scales to multi-TB databases with minimal downtime (CDC mode).

—————————
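
(For concreteness, step 1 on the MySQL side comes down to roughly the following - a sketch assuming pymysql and placeholder credentials. On Cloud SQL the binlog/GTID settings themselves are set as database flags on the instance, not from SQL; this just checks them and creates the replication user that goes into the step 4 connection profile.)

```python
# Sketch of the source prep in step 1, assuming pymysql and placeholder
# credentials. On Cloud SQL the server settings (binlog_format, gtid_mode,
# log retention) are configured as database flags on the instance; this
# script only verifies them and creates the replication user for DMS.
import pymysql

conn = pymysql.connect(
    host="10.0.0.5",   # source instance's private IP (placeholder)
    user="root",
    password="***",
)

with conn.cursor() as cur:
    # Verify the replication-related settings DMS expects.
    cur.execute(
        "SHOW GLOBAL VARIABLES WHERE Variable_name IN "
        "('log_bin', 'binlog_format', 'gtid_mode', 'binlog_expire_logs_seconds')"
    )
    for name, value in cur.fetchall():
        print(f"{name} = {value}")

    # Replication user for the DMS connection profile (hypothetical name).
    cur.execute("CREATE USER 'dms_repl'@'%' IDENTIFIED BY 'change-me'")
    cur.execute(
        "GRANT REPLICATION SLAVE, REPLICATION CLIENT, SELECT, SHOW VIEW, "
        "EXECUTE, TRIGGER ON *.* TO 'dms_repl'@'%'"
    )

conn.commit()
conn.close()
```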

Is it wrong?

In Project B, create a Private Connectivity Configuration under DMS → Private connectivity. Select the VPC from Project A that holds your source instance’s private IP

IIRC Gemini gave me this too. It's correct except for this one line, where it hallucinates that it's possible to select a VPC from another project (you can't), and therefore the whole set of instructions is useless.

This btw is an extremely common failure mode in my opinion. You ask it to do something complex, and it builds a very nice way to do it, except there's one link in the chain that it completely invented out of the thin air. And it totally worked just like that if that link existed, except it does not. It could be an API method, a tool, a UI option, I've encountered a number of things - it all looks very neat, except for one detail that completely ruin the whole thing. And if you note about it, it cheerfully congratulates you on discovering it and even more cheerfully explains why it has always known this doesn't work and can't work. If a person kept doing this to me I'd be infuriated but you can't really blame a bunch of numbers.