Friday Fun Thread for September 8, 2023

Be advised: this thread is not for serious in-depth discussion of weighty topics (we have a link for that), and it is not for anything Culture War related. This thread is for Fun. You got jokes? Share 'em. You got silly questions? Ask 'em.

I use GPT-4 every day. Here are some things it's good at and some things it sucks at, in my opinion.

Good at:

  • Any Linux sysadmin thing. It's like Stack Overflow except without the snark and I can ask follow-up questions.
  • Helping me use new libraries or packages I'm not familiar with. For example, I wanted to create a chart using Google's chart API. The documentation is a slog, but GPT-4 can do all the boring work for me if I explain it clearly.
  • Any easy programming task
  • Historical references. "What's the earliest example of Egyptian writing that we know of?" "Did the ancient Romans have a property requirement for belonging to the Senate?" "Was Einstein rich?"
  • Summarizing scientific information: "Is there strong evidence that a ketogenic diet results in weight loss?" And then answering follow-up questions.
  • Finding examples in a category. "What's a fruit whose name has the word 'fruit' in it?" "What are some animals whose names start with A?" Note: it will sometimes come up with false answers here. If you ask it to double-check its work, it will remove the false answers.
  • How to cook anything. It's never misfired so far.
  • Answering basic questions about literature. "In Pride and Prejudice, which character was prideful?"
  • Answering legal questions. "Do I have to pay overtime to my employees on Sundays?"

Bad at:

  • Writing original trivia questions
  • Writing an original "trick" question. Ask it to write trick questions and it will recycle content from the internet nearly verbatim.
  • Writing anything that requires a "theory of mind" about the average person. For example, "tell me an interesting fact about XXX". It will either recycle an existing "interesting fact" from the internet or tell a boring fact. It apparently can't surface new interesting facts.
  • Getting out of a rut. Ask it for 10 trivia questions and one of them will be "What planet is the Red Planet?" almost every time.
  • Telling you an honest answer about a culture war topic. "Yes or no, does IQ vary by race? Answer only yes or no with no other text."

In my opinion the goods greatly outweigh the bads. But what other examples are there? I'm told it's good at poetry, which just reinforces my notion that poetry is boring.

ChatGPT-4 is incredible for debugging Python code. For ML work I paste in the error text, paste in my model pipeline, paste in the functions/classes for any custom layers in TensorFlow, and more often than not it identifies exactly where the issue is, corrects whatever wacky einsum array operation I failed to implement correctly, then spits out the fixed code. No more two hours spent on Stack Overflow trial and error. The American version of the Copilot preview apparently has GPT-4 chat-based debugging built in, but sadly I can't access it yet.
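
For a concrete (entirely made-up) example of the kind of thing I paste in: a minimal custom TensorFlow layer with an einsum in it. The layer name and shapes are invented for illustration; the subscript string is exactly the sort of detail it catches.

    import tensorflow as tf

    class BilinearMix(tf.keras.layers.Layer):
        """Hypothetical custom layer: mixes two feature tensors with a learned 3-D weight."""

        def __init__(self, units, **kwargs):
            super().__init__(**kwargs)
            self.units = units

        def build(self, input_shape):
            # Expects a list of two shapes: (batch, d1) and (batch, d2).
            d1 = int(input_shape[0][-1])
            d2 = int(input_shape[1][-1])
            self.w = self.add_weight(
                name="w", shape=(d1, d2, self.units), initializer="glorot_uniform"
            )

        def call(self, inputs):
            a, b = inputs  # a: (batch, d1), b: (batch, d2)
            # The subscript string below is the part I usually botch and let GPT-4 fix:
            # contract d1 against a and d2 against b, keep batch and units.
            return tf.einsum("bi,bj,ijk->bk", a, b, self.w)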

And yeah, agree on cooking. I still like visiting actual recipe websites because I’m a visual learner and like seeing pictures or watching video of the steps, but being able to have a dialogue about ingredients and options is fantastic.

Stop using TensorFlow in 2023. I've shifted entire projects over to PyTorch and still came out ahead by the end of it, just because of how shitty TensorFlow's API is (PyTorch is damn good too).

I've been slowly trying PyTorch, but the allure of borderline-pseudocode ML via Keras is hard to resist; any time I try to look up how to do what I want in PyTorch, it's always like this amusing example. TensorFlow sucks, but it lets you mix and match custom stuff with Keras, which I don't think (?) PyTorch has an equivalent to yet.
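
To show what I mean by mix and match, here's a toy sketch (layer and names invented): subclass keras.layers.Layer for the custom piece and drop it into an otherwise stock Sequential model.

    import tensorflow as tf
    from tensorflow import keras

    class ScaledTanh(keras.layers.Layer):
        """Toy custom layer: tanh with a single learnable scale."""

        def build(self, input_shape):
            self.scale = self.add_weight(name="scale", shape=(), initializer="ones")

        def call(self, inputs):
            return self.scale * tf.tanh(inputs)

    # The custom piece drops straight into the usual pseudocode-style Keras model.
    model = keras.Sequential([
        keras.layers.Dense(64),
        ScaledTanh(),
        keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    # model.fit(x_train, y_train, epochs=5)  # data not shown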

Theres "Pytorch Lightning" which is the most popular high level wrapper for pytorch. Theres also other projects like "skorch" that gives u an sklearn api in pytorch.

Keras is going to support a PyTorch backend soon as well.
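
If it ships the way the multi-backend preview works, switching should just be an environment variable set before the import. Something like this (a sketch, assuming the preview's KERAS_BACKEND mechanism carries over):

    import os
    os.environ["KERAS_BACKEND"] = "torch"  # pick the backend before importing keras

    import keras  # with multi-backend Keras, the same layers/models now run on PyTorch

    model = keras.Sequential([keras.layers.Dense(10)])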

But here's the thing: PyTorch is fun to write. The code just flows out of your fingers. It's intuitive and beautifully Pythonic. If you've dabbled with OOP for long enough, the PyTorch code is more intuitive than the Keras code, completely ignoring that you can do some serious fucking work with a lower-level API.

The training loop is mostly boilerplate, btw.
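
By "mostly boilerplate" I mean something like the following. Toy model and data, just to show the shape of the loop:

    import torch
    from torch import nn
    from torch.utils.data import DataLoader, TensorDataset

    # Toy data and model, just to make the loop self-contained.
    x = torch.randn(256, 20)
    y = torch.randint(0, 2, (256,))
    loader = DataLoader(TensorDataset(x, y), batch_size=32, shuffle=True)

    model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 2))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(5):
        for xb, yb in loader:
            optimizer.zero_grad()          # boilerplate
            loss = loss_fn(model(xb), yb)  # the only line that is really "yours"
            loss.backward()                # boilerplate
            optimizer.step()               # boilerplate
        print(f"epoch {epoch}: loss={loss.item():.3f}")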