Attention Neural Networks with Linear Units, Google, Apr 21
Summary
The USPTO granted US Patent 12,608,616 B2 to Google LLC on April 21, 2026, covering attention neural networks with linear units. The patent, invented by Noam M. Shazeer, includes 21 claims across CPC classifications G06N 3/082, 3/048, 3/045, 3/088, and 3/084, relating to methods and systems for performing machine learning tasks using attention mechanisms.
“Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output.”
About this source
GovPing monitors USPTO Patent Grants - AI & Computing (G06N) for new telecom & technology regulatory changes. Every update since tracking began is archived, classified, and available as free RSS or email alerts — 20 changes logged to date.
What changed
The USPTO issued US Patent 12,608,616 B2 to Google LLC covering methods, systems, and apparatus for attention neural networks with linear units. The patent describes attention layers comprising an attention sub-layer and a feed-forward sub-layer that applies element-wise multiplication between vectors generated from two different linear transformations performed on the same attended layer input.
Affected parties, including technology companies developing neural network architectures, should review the patent claims to assess potential licensing needs before building similar attention mechanisms that combine linear transformations with element-wise multiplication.
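The gating described above, in which the same attended layer input is projected through two different linear transformations and the resulting vectors are multiplied element-wise, can be sketched as follows. All weight names, shapes, and the absence of bias terms or activation functions are illustrative assumptions, not details taken from the patent claims:

```python
import numpy as np

def gated_feed_forward(x, W_a, W_b, W_out):
    """Sketch of the feed-forward sub-layer described in the patent:
    two different linear transformations of the same attended input,
    combined by element-wise multiplication, then projected back to
    the model dimension (the output projection is an assumption)."""
    a = x @ W_a           # first linear transformation of the input
    b = x @ W_b           # second, different linear transformation
    gated = a * b         # element-wise multiplication of the two vectors
    return gated @ W_out  # illustrative output projection

rng = np.random.default_rng(0)
d_model, d_ff = 8, 32
x = rng.standard_normal((4, d_model))       # a batch of attended layer inputs
W_a = rng.standard_normal((d_model, d_ff))
W_b = rng.standard_normal((d_model, d_ff))
W_out = rng.standard_normal((d_ff, d_model))

y = gated_feed_forward(x, W_a, W_b, W_out)
print(y.shape)  # (4, 8)
```

Published variants of this idea (gated linear units and their relatives) often pass one of the two projections through a nonlinearity before the multiplication; the sketch above shows only the bare structure the grant text describes.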
Archived snapshot
Apr 22, 2026: GovPing captured this document from the original source. If the source has since changed or been removed, this is the text as it existed at that time.
Attention neural networks with linear units
Grant: US 12,608,616 B2 (Kind: B2), Apr 21, 2026
Assignee
Google LLC
Inventors
Noam M. Shazeer
Abstract
Methods, systems, and apparatus, including computer programs encoded on a computer storage medium, for performing a machine learning task on a network input to generate a network output. In one aspect, one of the systems includes an attention neural network configured to perform the machine learning task, the attention neural network including one or more attention layers, each attention layer comprising an attention sub-layer and a feed-forward sub-layer that applies an element-wise multiplication between two vectors generated as a result of two different linear transformations performed on the same attended layer input.
CPC Classifications
G06N 3/082 G06N 3/048 G06N 3/045 G06N 3/088 G06N 3/084
Filing Date
2025-10-16
Application No.
19360280
Claims
21
About this page
Source document text, dates, docket IDs, and authority are extracted directly from USPTO.
The summary, classification, recommended actions, deadlines, and penalty information are AI-generated from the original text and may contain errors. Always verify against the source document.