Open

Dimitris Bertsimas named vice provost for open learning

Dimitris Bertsimas PhD ’88 has been appointed vice provost for open learning...

LG Unveils 7.8B Open Source Model ‘ExaOne 3.0’… “Outperforms Big Tech Models of Same Class”

LG has released its latest model, 'EXAONE 3.0', as open source. The company emphasized that the small language model (SLM) with 7.8 billion parameters outperforms similarly sized global open source models such as 'Llama 3.1...

The Most Powerful Open Source LLM Yet: Meta LLAMA 3.1-405B

Memory Requirements for Llama 3.1-405B: Running Llama 3.1-405B requires substantial memory and computational resources. GPU Memory: The 405B model can utilize as much as 80GB of GPU memory per A100 GPU for efficient inference. Using Tensor...
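To put those figures in context, here is a rough back-of-envelope sketch (not from the article) of how weight memory scales with numeric precision for a 405B-parameter model. The 80 GB-per-A100 figure and the precisions shown are assumptions, and KV cache and activation memory are ignored.

```python
# Back-of-envelope weight-memory estimate for a 405B-parameter model.
# Assumptions (not from the article): weights only, no KV cache or
# activations, and 80 GB of usable memory per A100 GPU.

PARAMS = 405e9          # parameter count of Llama 3.1-405B
GPU_MEMORY_GB = 80      # assumed usable memory per A100

def weight_memory_gb(params: float, bytes_per_param: float) -> float:
    """Memory needed just to hold the model weights, in GB."""
    return params * bytes_per_param / 1e9

for precision, bytes_per_param in [("FP16/BF16", 2), ("FP8", 1), ("INT4", 0.5)]:
    total = weight_memory_gb(PARAMS, bytes_per_param)
    gpus = -(-total // GPU_MEMORY_GB)  # ceiling division
    print(f"{precision}: ~{total:,.0f} GB of weights -> at least {gpus:.0f} GPUs for weights alone")
```

In practice the GPU count is higher once KV cache, activations, and tensor-parallel overheads are included, which is why multi-node setups are typical for a model of this size.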

Acryl’s ‘Jonathan Allm’ Tops Open Source ‘Tiger Leaderboard’

Artificial intelligence (AI) specialist Acryl (CEO Park Oe-jin) announced on the 1st that its large language model (LLM) 'Jonathan Allm' ranked first in the open source category of the 'Tiger Leaderboard' operated by Weight...

Google launches ultra-small open source model ‘Gemma 2 2B’… “Outperforms GPT-3.5 and Mixtral 8x7B”

Google has open-sourced an on-device artificial intelligence (AI) model with 2.6 billion parameters. Google claims that the model outperforms larger models such as OpenAI’s ‘GPT-3.5’ and Mistral’s ‘Mixtral 8x7B’. VentureBeat reported on the thirty-first...

The Only Guide You Need to Fine-Tune Llama 3 or Any Other Open Source Model

Fine-tuning large language models (LLMs) like Llama 3 involves adapting a pre-trained model to specific tasks using a domain-specific dataset. This process leverages the model's pre-existing knowledge, making it efficient and cost-effective in comparison...
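As a concrete illustration of that workflow, below is a minimal LoRA fine-tuning sketch using the Hugging Face transformers, peft, and datasets libraries. The checkpoint ID, dataset, and hyperparameters are illustrative assumptions rather than recommendations from the article, and the Llama 3 weights require access approval on Hugging Face.

```python
# Minimal LoRA fine-tuning sketch with Hugging Face transformers + peft.
# The model ID, dataset, and hyperparameters below are illustrative
# assumptions, not taken from the article.

from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

model_id = "meta-llama/Meta-Llama-3-8B"   # assumed checkpoint; gated, requires access approval
tokenizer = AutoTokenizer.from_pretrained(model_id)
tokenizer.pad_token = tokenizer.eos_token

# device_map="auto" spreads the weights across available devices.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Wrap the base model with low-rank adapters so only a small
# fraction of parameters is actually trained.
lora = LoraConfig(r=16, lora_alpha=32, lora_dropout=0.05,
                  target_modules=["q_proj", "v_proj"], task_type="CAUSAL_LM")
model = get_peft_model(model, lora)

# Any instruction-style text dataset works; this one is a stand-in.
dataset = load_dataset("yahma/alpaca-cleaned", split="train[:1%]")

def tokenize(batch):
    text = [f"{i}\n{o}" for i, o in zip(batch["instruction"], batch["output"])]
    return tokenizer(text, truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="llama3-lora", per_device_train_batch_size=1,
                           gradient_accumulation_steps=8, num_train_epochs=1,
                           learning_rate=2e-4, logging_steps=10, fp16=True),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("llama3-lora-adapter")
```

Because only the low-rank adapter weights are trained, the saved adapter is small and can be loaded alongside (or merged into) the base model for inference.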

Meta releases ‘Llama 3.1’… “Beyond the strongest open source, performance matches GPT-4 and Claude”

Meta has released 'Llama 3.1', the largest open source AI model ever with 405 billion parameters. It emphasized that the model is comparable to the current best-performing models such as OpenAI's 'GPT-4o'...

Apple Launches Highest-Performance Open Source SLM ‘DCLM’… “Maximizing Performance with Data Curation”

Apple has released a new open source small language model (SLM) that it claims is its most powerful, trained on high-quality datasets through data curation. VentureBeat reported on the nineteenth (local time) that Apple has...
