LughMA to Futurology · English · 1 year ago

NVIDIA's Eos supercomputer can train a 175 billion parameter GPT-3 model in under four minutes
www.engadget.com
Oisteink@feddit.nl · 1 year ago

Just checking in to say they are still there - so many rascals showing off rigs these days