First “modern and powerful” open source LLM?

Key features

  • Fully open model: open weights + open data + full training details including all data and training recipes
  • Massively Multilingual: 1811 natively supported languages
  • Compliant: Apertus is trained while respecting the opt-out consent of data owners (even retrospectively), and avoiding memorization of training data
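
Since the weights, data, and recipes are all openly released, here is a minimal sketch of loading the model with Hugging Face transformers; the repository id used below (swiss-ai/Apertus-8B-Instruct-2509) and the generation settings are assumptions, so check the published model card for the exact names.

```python
# Minimal sketch: load the openly released Apertus weights with transformers.
# The repo id below is an assumption; see the swiss-ai organization on
# Hugging Face for the exact model names.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "swiss-ai/Apertus-8B-Instruct-2509"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Summarize what makes a language model fully open."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```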
  • Sonalder@lemmy.ml · 1 month ago

    Open source is a way of creating things, and it is not limited to software. LLMs are software. Most “open source” LLMs resort to open washing to label themselves as open source when they are not: what matters in open source is being able to study how something was made, yet most open models keep their training data sets and training methods closed. Apertus is truly open in the sense that its makers published open data and full training details.

    You have the right to be bothered by “AI”, but let open source enthusiasts be… well, enthusiasts when, in a field full of open washing, someone has created something truly open source, to the point of sharing it in an open source community on a FOSS platform.


  • snikta@programming.devOP · 1 month ago

    A fully open-source LLM

    As a fully open language model, Apertus allows researchers, professionals and enthusiasts to build upon the model and adapt it to their specific needs, as well as to inspect any part of the training process. This distinguishes Apertus from models that make only selected components accessible.

    “With this release, we aim to provide a blueprint for how a trustworthy, sovereign, and inclusive AI model can be developed,” says Martin Jaggi, Professor of Machine Learning at EPFL and member of the Steering Committee of the Swiss AI Initiative. The model will be regularly updated by the development team, which includes specialized engineers and a large number of researchers from CSCS, ETH Zurich and EPFL.
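
    As a rough illustration of “adapt it to their specific needs”, here is a hedged sketch of attaching LoRA adapters to the released base model with the peft library; the repository id and the target_modules names are assumptions about the architecture, not details stated in the release.

    ```python
    # Hedged sketch: parameter-efficient adaptation of the open Apertus weights
    # using LoRA via the peft library. Repo id and target module names are
    # assumptions; adjust them to the actual published model.
    from transformers import AutoModelForCausalLM
    from peft import LoraConfig, get_peft_model

    base = AutoModelForCausalLM.from_pretrained("swiss-ai/Apertus-8B-2509")  # assumed repo id

    lora = LoraConfig(
        r=16,                                 # low-rank adapter dimension
        lora_alpha=32,                        # adapter scaling factor
        target_modules=["q_proj", "v_proj"],  # assumed attention projection names
        task_type="CAUSAL_LM",
    )

    model = get_peft_model(base, lora)
    model.print_trainable_parameters()  # only the small adapter matrices are trainable
    # ...then fine-tune on domain data, e.g. with transformers.Trainer
    ```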