Effective Approaches to Attention-based Neural Machine Translation

Created on 2022-12-13T06:58:11-06:00


This card pertains to a resource available on the internet.

This card can also be read via Gemini.

The hidden states produced by the encoder are also provided to the decoder. An attention layer assigns a weight to each encoder hidden state, and those hidden states are merged in proportion to their weights into a single context vector before being fed into the decoder.
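A minimal sketch of that weighting step (Luong-style global attention with the dot score, as in the paper; the function and variable names here are my own, not from the paper):

```python
import numpy as np

def global_attention(decoder_state, encoder_states):
    """Dot-score global attention (a sketch, not the paper's code).

    decoder_state:  (d,)    current decoder hidden state
    encoder_states: (S, d)  one hidden state per source position

    Returns the context vector and the attention weights.
    """
    # Alignment score: one scalar per source position (dot score).
    scores = encoder_states @ decoder_state          # (S,)
    # Softmax turns scores into weights that sum to 1.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                         # (S,)
    # Context vector: encoder states merged in proportion to their weights.
    context = weights @ encoder_states               # (d,)
    return context, weights

# Toy usage: 5 source positions, hidden size 4.
rng = np.random.default_rng(0)
h_t = rng.standard_normal(4)
h_bar = rng.standard_normal((5, 4))
c_t, a_t = global_attention(h_t, h_bar)
print(a_t.round(3), c_t.round(3))
```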

Conjecture: they are finally starting to approximate the dendrite connections in brains: clusters of cortical neurons whose purpose is to contextually inhibit the firing of their columns.

Search help: "neural transformers" paper.