voidxMA to Futurology · English · 2 years ago
AI Companies Running Out of Training Data After Burning Through Entire Internet (futurism.com)
47 comments
CanadaPlus@lemmy.sdf.org · edited · 2 years ago
Well, it's established wisdom that the dataset size needs to scale with the number of model parameters. Quadratically, IIRC. If you don't have that much data, the training basically won't work; it will overfit or just not progress.
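As context for the scaling claim above: the commenter hedges with "IIRC", and the widely cited Chinchilla result (Hoffmann et al., 2022) actually found that compute-optimal training tokens scale roughly *linearly* with parameter count, at about 20 tokens per parameter, rather than quadratically. A minimal sketch under that linear heuristic (the 20-tokens-per-parameter constant is the commonly quoted rule of thumb, not an exact law):

```python
# Rough estimate of compute-optimal training data under the
# Chinchilla heuristic: tokens ≈ 20 × parameters (linear, not quadratic).

def chinchilla_tokens(params: int, tokens_per_param: int = 20) -> int:
    """Approximate compute-optimal training tokens for a model size."""
    return params * tokens_per_param

for p in (1 * 10**9, 70 * 10**9, 500 * 10**9):
    print(f"{p / 1e9:>6.0f}B params -> ~{chinchilla_tokens(p) / 1e12:.2f}T tokens")
```

Even under this linear rule, frontier-scale models need trillions of training tokens, which is consistent with the article's point that high-quality text data is becoming the binding constraint.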