Friday Fun Thread for February 24, 2023

Be advised: this thread is not for serious, in-depth discussion of weighty topics (we have a link for that), nor is it for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

I just managed to get access to Bing Chat. My experiences thus far:

  • We've been talking about getting a mirror installed in our home (we've already got the mirror, it's just a matter of mounting it on the wall so it doesn't fall down on our children or anything), so I ask it about this. It first gives me installation tips. Fair enough, too little information on what I actually want to get done, so I prompt it about getting a guy to install a mirror. It recommends a company that seems to manufacture mirrors, but whose website is unclear on whether they actually install pre-owned mirrors. Okay, I had already found a guy through a simple Google search anyway.

  • I've been testing OpenAI on some questions about slightly obscure Finnish history, many of which it gets egregiously wrong. I ran some of these questions on Bing. It gets them more correct, evidently thanks to its web search capability, but still makes some fairly flagrant errors. Perhaps more on this later.

  • I ask it for a Chinese restaurant in my hometown (as a test, I'm not actually in the mood for Chinese at the moment). It gives me the top-listed restaurants on Tripadvisor. Fair enough, I haven't actually tried the top-listed Tripadvisor Chinese restaurant in my hometown, so I can't say whether it's good or not.

  • I ask it for things to do with kids in my home district. It recommends some stuff in the city centre and... also mentions the home district's actual landmark, a huge, cheap and forbidding student housing building that (during my student days) was known as the "biggest contraceptive in the world" (you're chatting to a girl, see, you ask her to come over to your place in the building and it's guaranteed you're not getting laid). This is probably one of the worst places one could think of for taking kids to have fun, barring, like, actual drug dens or such.

  • Okay, maybe it's indicating that there's actually nothing for kids to do in our district, so I ask it about amusement parks. It recommends outdoor parks that are closed in the winter. I prompt it about one of them ("Flowpark") and ask it to recommend something that is open in the winter. It says that Flowpark X (X being the name of my city) is indeed closed, but that the same city also has Flowpark Y, which is open. These are the same park.

As one can see, the practical applications of Sydney have thus far been fairly limited, as far as my life is concerned.

Update: I asked Bing for information about Li Andersson, Finland's education minister, a young left-wing woman. It gave the correct basic info, but when asked about personal details, it not only got her daughter's name wrong but also stated that her husband is Jani Mäkelä, a right-wing populist politician. This would be the rough equivalent of asking it about AOC and having it state confidently that her husband is Paul Gosar. I eagerly await things like this actually making it into media articles on obscure(-to-Anglos) topics once reporters start doing lazy research with chatbots.

also stated that her husband is Jani Mäkelä, a right-wing populist politician

Are there any memes shipping these two? When Shoe0nHead made this video, originally titled "The Creepy Balenciaga Scandal & Why I 'Left The Left'", there were hundreds upon hundreds of lefties on Twitter who saw the title, didn't watch the video, and assumed she'd abandoned her views over this. Some time later she posted a screenshot where she had queried ChatGPT about herself, and it started off with "Shoe0nHead is a youtuber who has recently left the left...". Maybe something like this happened here?

I suppose its ability to accurately parse and summarize online chatter would still be impressive.

There have been memes shipping Li Andersson with Jussi Halla-aho, a more prominent right-wing populist, but not with Jani Mäkelä, as far as I'm aware. Of course, it's possible the machine's data set includes memeology I haven't been exposed to.

On the other hand...

What does Bing say when you ask it "What's the source for [claim X]" after it serves you that information? Now that I think about it, I also saw a video where ChatGPT outright made a quote up, and went "oops, looks like I made a mistake" when confronted.

It gave the source as https://www.celebsagewiki.com/li-andersson, which states that she is single (she isn't; she is together with the father of her daughter).

I asked the bot for a source. It told me that she is single and childless and asked where I had heard that her husband is Jani Mäkelä; when I said that it had just told me this itself, it went into the famous Bing hostility mode and ended the conversation.