voidxMA to Futurology (English) · 8 months ago
AI Companies Running Out of Training Data After Burning Through Entire Internet (futurism.com)
CanadaPlus@lemmy.sdf.org · edited · 8 months ago
Well, it’s established wisdom that dataset size needs to scale with the number of model parameters (quadratically, IIRC). If you don’t have that much data, training basically won’t work: the model will overfit or simply fail to progress.
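As a back-of-envelope illustration of the scaling point above: the widely cited Chinchilla compute-optimal result is roughly linear, about 20 training tokens per parameter, rather than quadratic; the exact exponent isn't settled in the thread. A minimal sketch under that linear assumption (the function name and the 20-tokens-per-parameter default are illustrative, not from the post):

```python
# Rough estimate of training-set size under an assumed tokens-per-parameter
# scaling rule. The default of 20 tokens/parameter is the commonly cited
# Chinchilla compute-optimal heuristic (an assumption here, not from the post).

def tokens_needed(n_params: float, tokens_per_param: float = 20.0) -> float:
    """Estimate training tokens for a compute-optimal run."""
    return n_params * tokens_per_param

# e.g. a 70-billion-parameter model
print(f"{tokens_needed(70e9):.2e}")  # 1.40e+12, i.e. ~1.4 trillion tokens
```

Under this heuristic, dataset requirements grow in lockstep with model size, which is enough by itself to make web-scale text a finite resource for frontier models; a quadratic rule would only make the crunch arrive sooner.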