Old profile: [email protected]

Mastodon: [email protected]

  • 0 Posts
  • 4 Comments
Joined 2 years ago
Cake day: June 28th, 2024


  • Luccus@feddit.org to Comic Strips@lemmy.world · Telepathy Club (OC) · 26 days ago

    I love this. It leaves just enough out so that it’s not immediately obvious what’s happening, but gives enough clues so that you can figure it out as you scroll.

    It builds up nicely, and once you understand it, it leaves you feeling clever & very fulfilled as a reader.

    You’ve basically figured out Valve’s (the video game company) definition of “fun” for a short comic strip. You should be proud of that! Also love the style. I hope to see more whenever you find inspiration.


  • I propose the body temperature of an average opossum as the fixed point for 100, because they are cute as heck. We shall call this unit Possigrade. Anything above 100 Possigrade should be called the ‘rabies zone’, and 0 Possigrade should correspond to 8°C, as this feels very cold when dressed inappropriately. In addition, there is now the Baker’s Possigrade, where 100 corresponds to 27°C, as this is the temperature at which sourdough bread rises by about ⅓ in 5.5 hours.

    But seriously: Celsius is fine. On Earth, we are primarily interested in water at atmospheric pressure. Too many things contain water (pipes, food, paint, etc.), and they behave differently at 0 °C than at 4 °C. For this reason, we deliberately avoid using water in applications that are regularly exposed to sub-zero temperatures. Water is simply everywhere, so 0 °C and 100 °C are important tipping points for general use.


  • TL;DR: The output of current LLMs will be very bandlimited and one-directional.

    I hope that means something to you; if not, I’m going to try to explain this very specific thing, though I’m afraid I might not be able to express it in very understandable terms (sorry):

    Firstly, one-directionality: when a human wants to write a story, we usually think of the plot twist beforehand and then pave the way by hinting at it without giving too much away. It’s just nice when a first-time reader is surprised, but on a second read struggles to see how they missed all the obvious clues.

    This process requires a lot of back-and-forth while writing. Humans do this naturally; LLMs and other transformer networks have a huge problem with it. I often hear LLMs referred to as text-prediction machines. That’s not entirely accurate, but it’s close enough. And to keep with the analogy: text prediction doesn’t really work backwards to suggest a better start to the sentence, does it? LLMs take one path, from start to finish, sometimes in great detail, but that’s it. There’s no setup. It’s very flat writing.
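    The one-directionality above can be sketched with a toy next-token loop. The bigram table here is a made-up stand-in for a real model, but the control flow is the point: every token is chosen from what came before, and earlier tokens are never revised to plant a setup.

```python
# Toy sketch of one-directional (autoregressive) generation.
# BIGRAMS is a hypothetical stand-in for a real model, not an LLM.
BIGRAMS = {
    "the": "hero",
    "hero": "wins",
    "wins": ".",
}

def generate(start: str, max_tokens: int = 10) -> list[str]:
    tokens = [start]
    for _ in range(max_tokens):
        nxt = BIGRAMS.get(tokens[-1])  # looks only at prior context
        if nxt is None:
            break
        tokens.append(nxt)  # append-only: no going back to revise the start
    return tokens

print(generate("the"))  # ['the', 'hero', 'wins', '.']
```

    The loop can append, but never edit: foreshadowing would require rewriting `tokens[0]` after the twist is known, and nothing in this generation scheme ever does that.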

    Secondly, bandlimiting: over time, LLMs tend to mush different characterizations and continuity into a smooth paste, leaving little grit. I really struggle not to say the word derivative (like in math). LLMs just write average characters who do average things in an average way, and then spell out how everything was totally unpredictable, important and meaningful, in superficially eloquent language. Nothing just is; everything serves as. It’s a poor writing style that reaches for sophistication and often misses the appropriate tone.
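    The “smooth paste” effect is essentially averaging: blend several distinct distributions and the distinctive peaks flatten out. A minimal numeric sketch (toy numbers and made-up word choices, not real model outputs):

```python
# Toy sketch: averaging distinct "voice" distributions flattens them.
# Three hypothetical characters each strongly favor one verb.
blunt  = {"snaps": 0.8, "quips": 0.1, "muses": 0.1}
witty  = {"snaps": 0.1, "quips": 0.8, "muses": 0.1}
dreamy = {"snaps": 0.1, "quips": 0.1, "muses": 0.8}

average = {
    word: (blunt[word] + witty[word] + dreamy[word]) / 3
    for word in blunt
}
print(average)  # every option lands near 1/3 -- no distinct voice left
```

    Each individual voice has a strong preference; the blend prefers nothing, which is the written equivalent of the average character doing average things.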