• 0 Posts
  • 56 Comments
Joined 1 year ago
Cake day: July 16th, 2023

  • public transit is how you get around without a car

    Except late in the evening, when many lines stop running or become very infrequent. Catching that late movie? Walk home.

    Getting the kids onto public transportation is a hassle. Teaching them to bike, and giving them a safe environment to ride in, is more fun.

    Moving groceries, little kids, and other stuff with a cargo bike is easy enough. Getting those groceries onto public transportation is not.

    And for a trip of one or two stops, a bike is usually much faster than waiting for the bus.

    Both public transportation and bikes have their use.


  • It does remind me of that recent Joe Scott video about the split brain. One half of the brain would do something, and the other half, which didn’t get the information because of the split, would just make up a semi-plausible explanation. It’s as if part of the brain really does work, at least partially, like an LLM.

    It’s more that our brain is like a corporation, with a spokesperson, a president and vice president, and a number of departments that work semi-independently. Having an LLM is like having only the spokesperson, without the rest of the workforce in the building that would make up an AGI.


  • they have to provide an answer

    Indeed. That’s the G in ChatGPT: it stands for generative. The model looks at all the previous words and “predicts” the most likely next word. You could see this very clearly with GPT-2; it just generated plausible-looking nonsense based on a few words.
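
    For the curious, here’s a minimal sketch of that next-word step, assuming the Hugging Face transformers package and the small public GPT-2 checkpoint (the prompt string is just a made-up example):

    ```python
    import torch
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    # Load the small public GPT-2 checkpoint and its tokenizer.
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # Any prompt works; this one is a hypothetical example.
    input_ids = tokenizer.encode("The cat sat on the", return_tensors="pt")

    with torch.no_grad():
        logits = model(input_ids).logits  # shape: (1, seq_len, vocab_size)

    # The model's entire output is a probability distribution over the
    # next token; "prediction" just means picking from it.
    probs = torch.softmax(logits[0, -1], dim=-1)
    top = torch.topk(probs, 5)
    for p, tok in zip(top.values, top.indices):
        print(f"{tokenizer.decode([tok.item()])!r}: {p.item():.2%}")
    ```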

    Then you have the P in ChatGPT: pre-trained. If it happens to have been trained on data about what you’re asking, that data shows up in the answer. If it hasn’t been trained on it, it just uses whatever is statistically likely and generates something that looks good enough for the prompt. It appears to hallucinate, lie, make stuff up.
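
    To make that concrete, here’s another small sketch under the same assumptions (Hugging Face transformers, small GPT-2 model). Generation has no “I don’t know” branch; the model will happily complete a prompt it can’t possibly have good data for:

    ```python
    from transformers import GPT2LMHeadModel, GPT2Tokenizer

    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2LMHeadModel.from_pretrained("gpt2")

    # A premise with no factual answer; the prompt is a made-up example.
    input_ids = tokenizer.encode("The capital of the moon is", return_tensors="pt")

    # The model still completes it with something plausible-looking,
    # because all it can do is emit likely next tokens.
    output = model.generate(
        input_ids,
        max_new_tokens=8,
        do_sample=False,  # greedy decoding, deterministic output
        pad_token_id=tokenizer.eos_token_id,  # silences a padding warning
    )
    print(tokenizer.decode(output[0]))
    ```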

    It’s just how the thing works. There is serious research into fixing this, and a recent paper claimed to have a method that lets the LLM know when it doesn’t know.