ihmoguy@alien.top to Data Hoarder@selfhosted.forum · English · 1 year ago
It is time to keep hoarding AI models as Chinese censorship hits NYC-based Huggingface, the biggest AI library (www.404media.co)
Swallagoon@alien.top · 1 year ago
How exactly do you back up an AI model anyway? How big is an AI model? Is it just a big zip file?
Masark@alien.top · 1 year ago
You simply download the file and keep it. We're talking about models you can run on your own computer, not services like ChatGPT.
Size is highly variable, depending on model size and format. For LLMs, the range runs from under 1 GB up to a current maximum of roughly 240 GB.
A model is a big file of numbers ("weights" or "parameters"), either integers or floating point depending on the format.
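A back-of-the-envelope illustration of the "big file of numbers" point above (my own sketch, not from the thread): since a checkpoint is essentially one number per parameter stored at a fixed width, its raw on-disk size follows directly from the parameter count and storage format. Sizes are decimal gigabytes and ignore file metadata.

```python
# Approximate checkpoint size from parameter count and number format.
# Illustrative sketch; real files add small amounts of metadata.
BYTES_PER_PARAM = {
    "fp32": 4,  # 32-bit floating point
    "fp16": 2,  # 16-bit floating point (a common release format)
    "bf16": 2,  # bfloat16
    "int8": 1,  # 8-bit integer
}

def raw_size_gb(n_params: float, dtype: str) -> float:
    """Approximate on-disk size in decimal gigabytes."""
    return n_params * BYTES_PER_PARAM[dtype] / 1e9

print(raw_size_gb(7e9, "fp16"))   # a 7B-parameter model in fp16: 14 GB
print(raw_size_gb(70e9, "fp16"))  # a 70B-parameter model in fp16: 140 GB
```

The same arithmetic explains the ~240 GB upper end: any model large enough in parameters, stored at 2 bytes per weight, gets there quickly.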
candre23@alien.top · 1 year ago
Anywhere from 1 GB to several hundred GB. Quantized (compressed), the most popular models are 8-40 GB each. LoRAs are a lot smaller, but full models take up a lot of space.
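To see where the quoted 8-40 GB range for quantized models comes from, here is a sketch under a simplifying assumption (mine, not from the thread): quantization stores each weight at some average number of bits, with overhead treated as negligible.

```python
# Estimate quantized model size from parameter count and average bits
# per weight. Illustrative sketch; real quantization formats add some
# overhead for scales and metadata.
def quantized_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate size in decimal gigabytes at a given bit width."""
    return n_params * bits_per_weight / 8 / 1e9

print(quantized_size_gb(13e9, 5))  # 13B at ~5 bits/weight: ~8.1 GB
print(quantized_size_gb(70e9, 4))  # 70B at 4 bits/weight: 35 GB
```

So a mid-sized model at aggressive quantization lands near the bottom of the 8-40 GB range, and a large model at 4 bits per weight near the top.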