MUM (Multitask Unified Model)
Google’s MUM (Multitask Unified Model) is a language model built on the same
Transformer architecture that underlies BERT. BERT, introduced in 2018 and
rolled out to Google Search in 2019, is a powerful Natural Language Processing
(NLP) model for search engines. When it first launched, BERT was itself
a search engine breakthrough. But, according to Google, MUM is 1,000 times
more powerful than BERT, which makes it a game-changer in the AI field.
What Makes MUM Better than BERT?
Google is confident that MUM gives it a far greater capacity to understand
language, and not just one language but many. According to Google, MUM can
therefore help a search engine understand queries better and serve up more
relevant results faster than ever before.
MUM & Multitasking
Much of MUM’s power lies in its ability to multitask. Instead of
having to complete one task before starting another, MUM can handle many tasks
at the same time. In practice, this means MUM can read text, understand the
meaning of what it reads, build deep knowledge about an issue or topic,
learn from audio and visual content, draw on input from more than 75 languages,
and then translate its key findings into multi-layered content, all at
the same time.
The Purpose of MUM AI
Google created MUM to help users get more information with fewer
searches. Finding quick answers has become a mainstay for people all over
the globe, and Google’s new MUM AI will likely keep Google the number one
search engine in the world for years to come.
Elaine Allan, BA. MBA
Technology & Business Blogger
Vancouver, BC, Canada