• 22 Posts
  • 110 Comments
Joined 1 year ago
Cake day: August 9th, 2023

  • I spent my childhood in Brooklyn (just a bridge away from Manhattan), right before the internet was a thing, and it seems pretty normal relative to what friends from other places describe. In fact, better in some ways. It was always easy to get a group of kids together to do whatever. We had pickup baseball (usually stickball), basketball, hide-and-seek, and other games. There were two nice parks and several pocket parks within easy walking distance. Most of us had bikes and rode them everywhere. A lot of my friends went to different schools (because of the density you might walk 3 blocks to the elementary school north of you, or 4 to the one south), so there were always new pools of people to interact with.

    Though I moved away, my sister still lives there and has kids of her own, and it seems pretty much the same now as it was then. Since the density of the place hasn’t changed much, it actually feels more like it used to than where I live now, which has changed significantly in population and traffic (and is heavily car-dependent) in just the last 15 years.


  • will_a113@lemmy.ml to AI@lemmy.ml · Do I understand LLMs?
    2 months ago

    The critical thing to remember about LLMs is that they are probabilistic in nature. They don’t know facts, they don’t reason, they don’t evaluate. All they do is take your input string, split it into tokens (chunks of roughly 3–4 characters), and then ask, based on the statistical patterns learned during pretraining, “given this series of tokens, which token most often came next in similar contexts?” The model then constructs the output one token at a time (more sophisticated setups can propose multiple tokens per step so that words, phrases and sentences hang together better) until it either emits a special end-of-sequence token or hits your output limit.
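
    A toy sketch of that “pick the most likely next token until a stop token appears” loop, purely as an illustration (my own example, not how any real model works): it uses a lookup table of next-token counts built from a tiny made-up corpus in place of learned neural-network weights, but the generation loop has the same shape as described above.

    ```python
    # Toy illustration only: real LLMs use learned neural-network weights,
    # not a lookup table. This counts which token followed which in a tiny
    # corpus, then repeatedly picks the most likely next token until a stop
    # token appears or the output limit is reached.
    from collections import Counter, defaultdict

    corpus = "the cat sat on the mat <eos> the dog sat on the rug <eos>".split()

    # "Pretraining": count how often each token follows each other token.
    next_counts = defaultdict(Counter)
    for prev, nxt in zip(corpus, corpus[1:]):
        next_counts[prev][nxt] += 1

    def generate(prompt_token, max_tokens=10):
        out = [prompt_token]
        for _ in range(max_tokens):                  # output limit
            candidates = next_counts[out[-1]]
            if not candidates:                       # nothing ever followed this token
                break
            token, _ = candidates.most_common(1)[0]  # greedy: most probable next token
            if token == "<eos>":                     # stop token ends generation
                break
            out.append(token)
        return " ".join(out)

    print(generate("the"))  # greedy chain, e.g. "the cat sat on the cat sat ..."
    ```

    Real models replace the count table with a neural network that scores every possible next token, and usually sample from that distribution rather than always taking the single most likely token, but the overall loop is the same.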