• ☆ Yσɠƚԋσʂ ☆@lemmy.mlOP
    5 months ago

For sure, and in a lot of use cases you don’t even need a really big model. There are a few niche scenarios where you need a large context window that isn’t practical to run on your own infrastructure, but in most cases I agree.