US-based Software-as-a-Service (SaaS) giant Salesforce has recently introduced XGen-7B, a series of 7B-parameter Large Language Models (LLMs) trained on an 8K input sequence length. The models are released under the Apache 2.0 license.
On standard NLP benchmarks, XGen achieves comparable or better results than other open-source LLMs such as Falcon, LLaMA, RedPajama, and OpenLLaMA.
So far, such models have been trained with a maximum sequence length of 2K tokens, which is a key limitation when modeling long sequences.
“In light of this, we train a series of 7B LLMs named XGen with standard dense attention on up to 8K sequence length for up to 1.5T tokens. We also fine tune the XGen models on public-domain instructional data, creating their instruction-tuned counterparts (XGen-7B-inst),” Salesforce said in a blog post.
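The released checkpoints can be loaded through the Hugging Face transformers library. The snippet below is only a minimal sketch, not an official example: the checkpoint name Salesforce/xgen-7b-8k-base and the use of trust_remote_code for the custom tokenizer are assumptions to verify against the published model cards.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed checkpoint name; the instruction-tuned variant (XGen-7B-inst) would load the same way.
model_id = "Salesforce/xgen-7b-8k-base"

# XGen ships a custom tokenizer, so trust_remote_code is assumed to be required here.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "Long-context language models are useful because"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))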
Recently, Salesforce also announced the launch of Salesforce ‘Starter’ for Micro, Small and Medium Enterprises (MSMEs) in India.
Starter is an easy-to-use CRM that bundles sales, service, and email outreach tools in one suite, helping companies get started with the tools they need to improve customer experiences, reduce costs, and drive revenue.
Earlier this month, Salesforce Ventures announced that it is expanding its Generative AI Fund, doubling the USD 250 million fund to USD 500 million as part of its continuing commitment to bolster the AI startup ecosystem and spark the development of responsible generative AI.