• blackfire@lemmy.world
    1 year ago

    So it was a perf test of a 1B-token-size model, not the full 3.7T that GPT-3 is trained with. I mean, great, they're showing improvement, but this is just a headline grabber; they haven't done anything actually useful here.

    • Oisteink@feddit.nl
      1 year ago

      Just checking in to say they're still there - so many rascals showing off rigs these days.