Cog

Cog is an open-source command-line tool for packaging ML models. Cog gives you a consistent environment to run your model in – for developing on your laptop, training on GPU…
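
To make this concrete, a model is packaged as a small Python class that Cog turns into a container with an HTTP prediction API. A minimal sketch using Cog's documented BasePredictor and Input interface (the "model" here is a trivial stand-in; a real predictor would load weights in setup):

    from cog import BasePredictor, Input

    class Predictor(BasePredictor):
        def setup(self):
            # Runs once when the container starts; load model weights here.
            self.prefix = "echo: "  # stand-in for real model loading

        def predict(self, prompt: str = Input(description="Text to process")) -> str:
            # Called per request; Cog derives the API schema from this signature.
            return self.prefix + prompt

Paired with a cog.yaml that declares the Python version and dependencies, this can be exercised locally with cog predict -i prompt="hello".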

Wu Dao 1.0 (Enlightenment 1.0)

In a bid to promote the research and development of China’s own large-scale pretraining models and further explore universal intelligence from a more fundamental perspective, the Beijing Academy of Artificial…

Kalendar AI

Kalendar AI sends personalized, exclusive invitations, generated with GPT-3, to your ideal customer profiles, with time-sensitive bidding on your availability.
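
Kalendar AI's actual pipeline is proprietary; the sketch below only illustrates how a GPT-3-personalized invitation could be generated with OpenAI's (2021-era) completions API. The prompt wording and profile fields are invented for illustration:

    import openai

    openai.api_key = "YOUR_API_KEY"

    def draft_invitation(profile: dict) -> str:
        # Profile fields are illustrative, not Kalendar AI's actual schema.
        prompt = (
            f"Write a short, personalized meeting invitation for {profile['name']}, "
            f"{profile['role']} at {profile['company']}, explaining how our product "
            f"saves sales teams scheduling time."
        )
        response = openai.Completion.create(
            engine="davinci",
            prompt=prompt,
            max_tokens=120,
            temperature=0.7,
        )
        return response.choices[0].text.strip()

    print(draft_invitation({"name": "Ana", "role": "VP Sales", "company": "Acme"}))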

Mantium

Mantium is a developer tool platform that provides logging, human-in-the-loop review, prompt management, prompt workflows, and security.
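
Mantium's own SDK is not reproduced here; the following hypothetical sketch just illustrates what "logging" and "human in the loop" mean in a prompt workflow: a generation step whose output is logged and held for review before release.

    import logging
    from typing import Optional

    logging.basicConfig(level=logging.INFO)
    log = logging.getLogger("prompt-workflow")

    def generate(prompt: str) -> str:
        # Stand-in for a real LLM call.
        return f"[model output for: {prompt}]"

    def human_review(draft: str) -> bool:
        # A real workflow would enqueue the draft for a reviewer;
        # here we simply ask on the console.
        return input(f"Approve this output?\n{draft}\n[y/N] ").strip().lower() == "y"

    def run_workflow(prompt: str) -> Optional[str]:
        log.info("prompt submitted: %r", prompt)   # logging
        draft = generate(prompt)                   # prompt execution
        if human_review(draft):                    # human in the loop
            log.info("approved")
            return draft
        log.info("rejected; output withheld")      # guardrail
        return None

    run_workflow("Summarize our refund policy for a customer email.")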

Spanish-speaking banking agent

BBVA Labs tested OpenAI’s GPT-3, one of the largest artificial intelligence (AI) language models ever created, on ten Spanish customer conversations about banking. Without any previous customization, re-training, or transfer…
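
BBVA's write-up does not publish its prompts; the sketch below only shows what zero-shot use of GPT-3 on a Spanish banking query looks like in principle, with no fine-tuning, using OpenAI's (2021-era) completions API. The prompt text is invented:

    import openai

    openai.api_key = "YOUR_API_KEY"

    # Zero-shot: no examples, no fine-tuning; the raw model handles Spanish directly.
    conversation = "Cliente: Hola, quiero saber el saldo de mi cuenta corriente."
    prompt = (
        "Eres un agente bancario. Responde en español, de forma breve y cortés.\n"
        f"{conversation}\nAgente:"
    )
    response = openai.Completion.create(
        engine="davinci",
        prompt=prompt,
        max_tokens=80,
        temperature=0.3,
        stop=["Cliente:"],
    )
    print(response.choices[0].text.strip())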

Snazzy AI

Snazzy AI is the easiest way to create content for your brand: an AI-powered copywriter will do the work for you! Through a combination of best-in-class tech…

NLP Cloud

NLP Cloud provides a text understanding and text generation (NLP) API for NER, sentiment analysis, text classification, summarization, question answering, text generation, translation, language detection, grammar and spelling correction, intent…
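
For illustration, NLP Cloud exposes these capabilities as per-model endpoints; a sketch using its Python client (the model names and method signatures follow its documentation at the time and may have changed; the token is a placeholder):

    # pip install nlpcloud
    import nlpcloud

    # Named entity recognition with a spaCy-based model.
    ner = nlpcloud.Client("en_core_web_lg", "YOUR_API_TOKEN")
    print(ner.entities("John Doe has worked for Microsoft in Seattle since 1999."))

    # Sentiment analysis follows the same pattern with a different model.
    sentiment = nlpcloud.Client("distilbert-base-uncased-finetuned-sst-2-english", "YOUR_API_TOKEN")
    print(sentiment.sentiment("NLP Cloud is easy to integrate."))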

HyperCLOVA

South Korea’s Naver Corp. has unveiled its supersized artificial intelligence (AI) platform “HyperCLOVA”, a new Korean-language model system enabling human-brain-like linguistic capacity. During an online conference on Tuesday,…

Switch Transformers by Google Brain

In deep learning, models typically reuse the same parameters for all inputs. Mixture of Experts (MoE) defies this and instead selects different parameters for each incoming example. The result is…
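
A minimal sketch of that idea in PyTorch: a router assigns each token to exactly one expert (Switch-style top-1 routing), so each input only touches one expert's parameters. The paper's load-balancing loss, capacity factor, and distributed expert sharding are omitted here:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SwitchFFN(nn.Module):
        """Top-1 (Switch-style) mixture-of-experts feed-forward layer."""

        def __init__(self, d_model: int, d_ff: int, n_experts: int):
            super().__init__()
            self.router = nn.Linear(d_model, n_experts)
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(), nn.Linear(d_ff, d_model))
                for _ in range(n_experts)
            )

        def forward(self, x):                            # x: (tokens, d_model)
            gates = F.softmax(self.router(x), dim=-1)    # routing probabilities
            expert_idx = gates.argmax(dim=-1)            # top-1 expert per token
            out = torch.zeros_like(x)
            for i, expert in enumerate(self.experts):
                mask = expert_idx == i                   # tokens routed to expert i
                if mask.any():
                    # Scale by the gate value so routing stays differentiable.
                    out[mask] = gates[mask, i].unsqueeze(-1) * expert(x[mask])
            return out

    layer = SwitchFFN(d_model=16, d_ff=64, n_experts=4)
    print(layer(torch.randn(8, 16)).shape)  # torch.Size([8, 16])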
