
Commit 0480af9a authored by Lev Proleev

LSTM: require input layer norm weights to be omitted when CIFG is used.

In the case of CIFG LSTM, the input layer norm weights are not used in the
computation.

Bug: 129126572
Test: mma
Change-Id: I57bc578606132a2c44c71ab63dd7645dcc001302
parent 7b277acb
+1 −1
@@ -512,7 +512,7 @@ b83317b66721241887d2770b5ae95fd5af1e77c5daa7530ecb08fae8892f2b43 android.hardwar
92714960d1a53fc2ec557302b41c7cc93d2636d8364a44bd0f85be0c92927ff8 android.hardware.neuralnetworks@1.2::IExecutionCallback
36e1064c869965dee533c537cefbe87e54db8bd8cd45be7e0e93e00e8a43863a android.hardware.neuralnetworks@1.2::IPreparedModel
e1c734d1545e1a4ae749ff1dd9704a8e594c59aea7c8363159dc258e93e0df3b android.hardware.neuralnetworks@1.2::IPreparedModelCallback
-209a5ee694b94328afb2af2768f1fe6a69148e2cbb85ec3c340a36eed818c697 android.hardware.neuralnetworks@1.2::types
+9b3963253e521cca19fd81aeca83aee6dcfe3bdf2805c07cb2d3f64381709b71 android.hardware.neuralnetworks@1.2::types
cf7a4ba516a638f9b82a249c91fb603042c2d9ca43fd5aad9cf6c0401ed2a5d7 android.hardware.nfc@1.2::INfc
abf98c2ae08bf765db54edc8068e36d52eb558cff6706b6fd7c18c65a1f3fc18 android.hardware.nfc@1.2::types
4cb252dc6372a874aef666b92a6e9529915aa187521a700f0789065c3c702ead android.hardware.power.stats@1.0::IPowerStats
+5 −2
@@ -1188,8 +1188,11 @@ enum OperationType : int32_t {
     *   value if the recurrent projection layer exists, and should otherwise
     *   have no value.
     * * (API level >= 29) The four layer normalization weights either all have
-     *   values or none of them have values. Layer normalization is used when
-     *   values are present.
+     *   values or none of them have values. Additionally, if CIFG is used,
+     *   input layer normalization weights tensor is omitted and the other layer
+     *   normalization weights either all have values or none of them have
+     *   values. Layer normalization is used when the values of all the layer
+     *   normalization weights are present.
     *
     * References:
     *
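For illustration, here is a minimal C++ sketch of the constraint this change documents. This is hypothetical code, not the actual NNAPI validation routine; the function name, the boolean-array encoding of which optional weight tensors are present, and the gate ordering are all assumptions made for this sketch.

#include <array>

// Checks the layer normalization weight rule described above for an LSTM op.
// hasValue[i] indicates whether the optional layer norm weight tensor for
// gate i is provided: 0 = input, 1 = forget, 2 = cell, 3 = output.
bool validateLayerNormWeights(bool cifgUsed,
                              const std::array<bool, 4>& hasValue) {
    if (cifgUsed) {
        // With CIFG, the input layer norm weights must be omitted...
        if (hasValue[0]) return false;
        // ...and the remaining three must be all present or all absent.
        return hasValue[1] == hasValue[2] && hasValue[2] == hasValue[3];
    }
    // Without CIFG, all four must be all present or all absent.
    return hasValue[0] == hasValue[1] && hasValue[1] == hasValue[2] &&
           hasValue[2] == hasValue[3];
}

The rationale follows from the commit message: under CIFG the input gate is coupled to the forget gate rather than computed independently, so the input layer normalization weights play no part in the computation, and requiring them to be omitted keeps the operand list unambiguous.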