Hello!
I found an AI-Specific Code smell in your project.
The smell is called: TensorArray Not Used
You can find more information about it in this paper: https://dl.acm.org/doi/abs/10.1145/3522664.3528620.
According to the paper, the smell is described as follows:
Problem
If the developer initializes an array using tf.constant() and tries to assign new values to it in a loop to keep it growing, the code will run into an error. The developer can fix this error with the low-level tf.while_loop() API. However, coding this way is inefficient: a lot of intermediate tensors are built in the process.
Solution
Using tf.TensorArray() to grow an array in a loop is a better solution for this kind of problem in TensorFlow 2.
Impact
Efficiency, Error-proneness
Example:
### TensorFlow
import tensorflow as tf

@tf.function
def fibonacci(n):
    a = tf.constant(1)
    b = tf.constant(1)
-   c = tf.constant([1, 1])
+   c = tf.TensorArray(tf.int32, n)
+   c = c.write(0, a)
+   c = c.write(1, b)
    for i in range(2, n):
        a, b = b, a + b
-       c = tf.concat([c, [b]], 0)
+       c = c.write(i, b)
-   return c
+   return c.stack()
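For reference, here is a minimal, self-contained sketch of the corrected version (my own illustration, not code from the paper or from nematus), assuming TensorFlow 2.x; tf.range is used in place of range so the function also works when n is passed as a tensor:

import tensorflow as tf

@tf.function
def fibonacci(n):
    a = tf.constant(1)
    b = tf.constant(1)
    # tf.TensorArray grows element by element inside the traced loop,
    # avoiding the intermediate tensors that repeated tf.concat calls create.
    c = tf.TensorArray(tf.int32, size=n)
    c = c.write(0, a)
    c = c.write(1, b)
    for i in tf.range(2, n):
        a, b = b, a + b
        c = c.write(i, b)
    return c.stack()

print(fibonacci(tf.constant(10)).numpy())  # [ 1  1  2  3  5  8 13 21 34 55]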
You can find the code related to this smell in this link:
nematus/nematus/mrt_utils.py, lines 230 to 250 (commit 6419dbb)
I also found instances of this smell in other files, such as:
File: https://github.com/EdinburghNLP/nematus/blob/master/nematus/layers.py#L122-L132 Line: 127
File: https://github.com/EdinburghNLP/nematus/blob/master/nematus/layers.py#L127-L137 Line: 132
I hope this information is helpful!