Meta AI is set to make its large language model, a natural language processing (NLP) system called Open Pretrained Transformer (OPT-175B), publicly available.
OPT-175B has 175 billion parameters and was trained on publicly available datasets. In a first for a language technology system of its size, the release includes both the pretrained models and the code needed to train and use them.
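By way of illustration only, and not part of Meta's announcement, the smaller OPT baselines have been distributed through the Hugging Face Transformers library. The sketch below assumes the "facebook/opt-125m" checkpoint is available on the Hugging Face Hub and shows how a researcher might load and query it.

```python
# Minimal sketch: loading a smaller OPT baseline with Hugging Face Transformers.
# Assumes the "facebook/opt-125m" checkpoint is published on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/opt-125m")
model = AutoModelForCausalLM.from_pretrained("facebook/opt-125m")

# Generate a short continuation from a prompt.
inputs = tokenizer("Large language models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```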
In the past few years, large language models with more than 100 billion parameters have transformed NLP and AI research.
Until now, though, access to such systems has mostly been restricted to paid APIs for the public, while full access has remained the preserve of a few highly resourced labs. This has hindered researchers’ ability to gain insight into how and why large language models work.
To prevent misuse, Meta is releasing the model under a non-commercial licence focused on research use cases. Access will be granted to academic researchers and to those affiliated with organisations in government, civil society, and academia.
Meta posits that a far broader segment of the AI community needs access to these models to conduct research that is reproducible and will collectively drive the field forward.
With the release of OPT-175B and smaller-scale baselines, the company hopes to increase the diversity of voices defining the ethical considerations of the technology.