Fix typo under torch/_dynamo directory (#110459)
Summary:
This PR fixes typos in comments in files under the `torch/_dynamo` directory

X-link: pytorch/pytorch#110459
Approved by: https://github.com/colesbury

Reviewed By: PaliC

Differential Revision: D49919270

fbshipit-source-id: 7c688ed0a529f39ea61a545021fc924ede508909
kiszk authored and facebook-github-bot committed Oct 5, 2023
1 parent 6bc2b07 commit 0b5d5bb
Showing 1 changed file with 5 additions and 5 deletions.
10 changes: 5 additions & 5 deletions userbenchmark/dynamo/dynamobench/_dynamo/utils.py
@@ -1424,7 +1424,7 @@ def run_node(tracer, node, args, kwargs, nnmodule):
     """
     Runs a given node, with the given args and kwargs.
-    Behavior is dicatated by a node's op.
+    Behavior is dictated by a node's op.
     run_node is useful for extracting real values out of nodes.
     See get_real_value for more info on common usage.
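
The docstring above says run_node's behavior is dictated by the node's op. As a rough illustration (a sketch only, not dynamo's actual implementation; the helper name is made up), dispatching on a torch.fx node op typically looks like:

import torch.fx

def run_node_sketch(node, args, kwargs, nnmodule):
    # Illustrative dispatch over the common fx op kinds.
    if node.op == "call_function":
        return node.target(*args, **kwargs)
    if node.op == "call_method":
        self_obj, *rest = args
        return getattr(self_obj, node.target)(*rest, **kwargs)
    if node.op == "call_module":
        return nnmodule(*args, **kwargs)
    raise AssertionError(f"unhandled op {node.op}")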
@@ -1593,7 +1593,7 @@ def tensor_always_has_static_shape(
     Args:
     tensor - the real tensor to evaluate, parameters force a static shape.
-    is_tensor - internal dynamo check, esentially "is_tensor": target_cls is TensorVariable,
+    is_tensor - internal dynamo check, essentially "is_tensor": target_cls is TensorVariable,
     tensors not in a TensorVariable for whatever reason are forced static.
     Returns a tuple, where the first element is the bool of whether or not this tensor should have a static shape.
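
In spirit, the rule this docstring documents can be sketched as follows (the function name and reason strings are illustrative, not dynamo's actual return values):

import torch

def static_shape_sketch(tensor, is_tensor):
    # Returns (force_static, reason), mirroring the tuple described above.
    if isinstance(tensor, torch.nn.Parameter):
        return True, "parameters force a static shape"
    if not is_tensor:
        return True, "not tracked as a TensorVariable"
    return False, None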
@@ -1832,7 +1832,7 @@ def defake(x):
 
 
 def is_utils_checkpoint(obj):
-    # Lazy import to avoid circular dependenices
+    # Lazy import to avoid circular dependencies
     import torch.utils.checkpoint
 
     return obj is torch.utils.checkpoint.checkpoint
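
Since is_utils_checkpoint is an identity check against the torch.utils.checkpoint.checkpoint callable, usage is straightforward (the function is repeated here so the snippet stands alone):

import torch
import torch.utils.checkpoint

def is_utils_checkpoint(obj):
    # Same body as the snippet above.
    import torch.utils.checkpoint
    return obj is torch.utils.checkpoint.checkpoint

assert is_utils_checkpoint(torch.utils.checkpoint.checkpoint)
assert not is_utils_checkpoint(torch.nn.functional.relu)  # any other callable fails the identity check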
@@ -1842,8 +1842,8 @@ def build_checkpoint_variable(**options):
     import torch._higher_order_ops.wrap as higher_order_ops
     from .variables.higher_order_ops import TorchHigherOrderOperatorVariable
 
-    # TODO - This is a temporary sitaution where we have two versions of
-    # checkpointing implemetation. We will converge on one and remove the other.
+    # TODO - This is a temporary situation where we have two versions of
+    # checkpointing implementation. We will converge on one and remove the other.
     activation_checkpoint_op = higher_order_ops.tag_activation_checkpoint
     if torch._functorch.config.functionalize_rng_ops:
         activation_checkpoint_op = higher_order_ops.wrap_activation_checkpoint
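
For context, which of the two checkpoint ops the snippet above selects depends only on the functorch config flag it reads; a minimal check of the current setting (assuming torch._functorch.config is importable as shown in the diff):

import torch._functorch.config

# functionalize_rng_ops False -> tag_activation_checkpoint
# functionalize_rng_ops True  -> wrap_activation_checkpoint
print(torch._functorch.config.functionalize_rng_ops)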
