Google LLC Patent, Omnidirectional Transformers, Apr 21
Summary
Google LLC has been granted US Patent 12608594B2 for machine-learned attention models featuring omnidirectional processing. The patent, titled 'Omnidirectional Representations from Transformers' (OMNINET), describes neural network architectures where each token can attend to all tokens across the entire network rather than maintaining strictly horizontal receptive fields. The application (17592796) was filed on February 4, 2022, and the patent contains 19 claims.
“In example models described in the present disclosure, instead of maintaining a strictly horizontal receptive field, each token is allowed to attend to all tokens in some or all of the other layers across the entire network.”
About this source
GovPing monitors USPTO Patent Grants - AI & Computing (G06N) for new telecom & technology regulatory changes. Every update since tracking began is archived, classified, and available as free RSS or email alerts — 32 changes logged to date.
What changed
Google LLC has been granted US Patent 12608594B2 for 'Machine-learned attention models featuring omnidirectional processing' (OMNINET). The patent covers neural network architectures where tokens attend to all other tokens across the entire network, deviating from traditional horizontal receptive fields.
Technology companies developing transformer-based models, AI research organizations, and firms implementing attention mechanisms should review the patent claims (19 total) for potential licensing implications or freedom-to-operate considerations in the machine learning space.
Archived snapshot
On Apr 23, 2026, GovPing captured this document from the original source. If the source has since changed or been removed, this is the text as it existed at that time.
Machine-learned attention models featuring omnidirectional processing
Grant US12608594B2 (Kind: B2), granted Apr 21, 2026
Assignee
GOOGLE LLC
Inventors
Yi Tay, Da-Cheng Juan, Dara Bahri, Donald Arthur Metzler, Jr., Jai Prakash Gupta, Mostafa Dehghani, Phillip Pham, Vamsi Krishna Aribandi, Zhen Qin
Abstract
Provided are machine-learned attention models that feature omnidirectional processing, example implementations of which can be referred to as Omnidirectional Representations from Transformers (OMNINET). In example models described in the present disclosure, instead of maintaining a strictly horizontal receptive field, each token is allowed to attend to all tokens in some or all of the other layers across the entire network.
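The core idea in the abstract can be illustrated with a toy sketch: in standard attention a token's keys and values come only from its own layer (a "horizontal" receptive field), while omnidirectional attention lets a token attend to hidden states drawn from every layer of the network. The NumPy example below is an illustrative simplification under that reading, not the patented architecture; all array sizes and function names are assumptions for demonstration.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # scaled dot-product attention
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

rng = np.random.default_rng(0)
n_tokens, d, n_layers = 4, 8, 3

# Toy "layer stack": hidden states for each token at each layer.
layers = [rng.standard_normal((n_tokens, d)) for _ in range(n_layers)]

# Horizontal attention: tokens attend only within the current layer.
horizontal = attention(layers[-1], layers[-1], layers[-1])

# Omnidirectional attention: keys/values are the hidden states of all
# tokens at ALL layers, flattened into one set.
all_states = np.concatenate(layers, axis=0)   # (n_layers*n_tokens, d)
omni = attention(layers[-1], all_states, all_states)

print(horizontal.shape)  # (4, 8)
print(omni.shape)        # (4, 8)
```

Both variants return one vector per query token; the omnidirectional variant simply widens the set of keys and values from `n_tokens` to `n_layers * n_tokens` entries.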
CPC Classifications
G06N 3/045 G06N 3/10
Filing Date
2022-02-04
Application No.
17592796
Claims
19
Related changes
Get daily alerts for USPTO Patent Grants - AI & Computing (G06N)
Daily digest delivered to your inbox.
Free. Unsubscribe anytime.
Source
About this page
Every important government, regulator, and court update from around the world. One place. Real-time. Free.
Source document text, dates, docket IDs, and authority are extracted directly from USPTO.
The summary, classification, recommended actions, deadlines, and penalty information are AI-generated from the original text and may contain errors. Always verify against the source document.
Classification
Who this affects
Taxonomy
Get alerts for this source
We'll email you when USPTO Patent Grants - AI & Computing (G06N) publishes new changes.