readArray methods #156
Conversation
```scala
   *
   * @group slow
   */
  def flatArray: Future[Array[closure.JvmValue]] = {
    flatBuffer.intransitiveMap(closure.valueType.memory.toArray).run
  }

  /** Recursive function to reshape a flat array into a multi-dimensional one
    */
  private[Tensors] def reshapeArray(a: Array[_], b: Array[Int]): Array[_] = {
```
Needs a check that the product of the dimensions in `b` equals the length of `a`.
Also consider renaming those variables to something more informative.
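For illustration, a standalone sketch (omitting the `private[Tensors]` modifier) of how that check and the renaming might look; the slicing-based recursion is only illustrative, not necessarily the implementation in this PR:

```scala
def reshapeArray(array: Array[_], shape: Array[Int]): Array[_] = {
  // Suggested check: the flat array must hold exactly one element per cell of the shape.
  require(
    array.length == shape.product,
    s"flat array length ${array.length} does not match shape [${shape.mkString(", ")}]"
  )
  if (shape.length <= 1) {
    array
  } else {
    // Split along the leading dimension and recursively reshape each slice
    // to the remaining dimensions.
    val innerSize = shape.tail.product
    val rows = new Array[AnyRef](shape.head)
    for (i <- rows.indices) {
      rows(i) = reshapeArray(array.slice(i * innerSize, (i + 1) * innerSize), shape.tail)
    }
    rows
  }
}
```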
```scala
  /** Recursive function to reshape a flat array into a multi-dimensional one
    */
  private[Tensors] def reshapeArray(array: Array[_], shape: Array[Int]): Array[_] = {
```
When `shape` is `Array.empty`, the tensor is a scalar value, not an `Array`. This method should either throw an exception for scalar values or change the return type.
Right! If scalars are considered distinct from 1D arrays, then I think it makes sense that the function `readArray` would throw an exception.
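For illustration, a hedged sketch of that behaviour, assuming the existing `flatArray`, `reshapeArray`, and `shape` members and an implicit `ExecutionContext` in scope:

```scala
import scala.concurrent.Future

def readArray: Future[Array[_]] = {
  if (shape.isEmpty) {
    // A rank-0 tensor is a scalar, so reading it as an Array is rejected up front.
    Future.failed(new IllegalArgumentException(
      "readArray is undefined for scalar (rank-0) tensors"))
  } else {
    flatArray.map(flat => reshapeArray(flat, shape))
  }
}
```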
If scalars are considered distinct from 1D arrays, why do 1D arrays and 2D arrays share the same method? Especially considering that the `Array[_]` return type is not actually usable without an `asInstanceOf`.

There are three options:

- Create two methods for scalars and 1+ dimensional arrays, respectively
- Create one `Any` method for both scalars and 1+ dimensional arrays
- Create n methods for different ranks of tensors, respectively

What do you think?
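As a concrete example of the first option, a `readArray` like the sketch above could be paired with a separate scalar accessor; the name `readScalar` is hypothetical, and the existing `flatArray` and `shape` members plus an implicit `ExecutionContext` are assumed:

```scala
import scala.concurrent.Future

// Hypothetical companion to readArray, only valid for rank-0 tensors.
def readScalar: Future[closure.JvmValue] = {
  require(shape.isEmpty, "readScalar is only defined for scalar (rank-0) tensors")
  flatArray.map(_.head)
}
```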
```scala
  /** Recursive function to reshape a flat Seq into a multi-dimensional one
    */
  private[Tensors] def reshapeSeq(a: Seq[_], b: Array[Int]): Seq[_] = {
```
When `shape` is `Array.empty`, the tensor is a scalar value, not a `Seq`. This method should either throw an exception for scalar values or change the return type.
```scala
   * @group slow
   */

  def toArray = { flatArray.map { z => reshapeArray(z, shape) } }
```
Since `toArray` or `toSeq` in the Scala collection library is not asynchronous, the name `toArray` will surprise people if it returns a `Future`. Maybe we should rename it to `readArray`, which sounds like a rather expensive operation.
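A hedged illustration of the naming concern; the awaiting helper below is hypothetical and not part of the PR:

```scala
import scala.concurrent.{Await, Future}
import scala.concurrent.duration._

// A collection-style name such as toArray suggests a cheap, synchronous conversion.
// Because the result is a Future, the caller has to handle asynchrony explicitly,
// e.g. by awaiting it, which a read* name signals much better.
def awaitArray(pending: Future[Array[_]]): Array[_] =
  Await.result(pending, 30.seconds)
```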
You're right that the user would need to know the number of dimensions and cast with `asInstanceOf`. (I haven't touched …)
The best solution is changing …, but it requires a huge number of changes in the code base. At the moment I think …
Do you think it is worth having to parametrize `Tensor`? What do you not like about the …?
Ah, I just saw your edited comment. I think …
Eventually we will need type parameters for `Tensor`. In practice the length of a tensor parameter may vary across multiple calls to a function, e.g. a different batch size when running a neural network. However, the number of dimensions of a tensor is usually stable, so it is reasonable to statically type it.
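A hedged sketch of that idea (the trait name `TypedTensor` is made up to avoid clashing with the existing `Tensor`): the type parameter pins down the rank and element type, while the run-time shape, such as the batch size, stays free to vary:

```scala
import scala.concurrent.Future

// Hypothetical parameterized interface: A fixes the rank and element type,
// e.g. Array[Array[Float]] for a rank-2 float tensor, while the shape itself
// (such as the leading batch dimension) may differ between calls at run time.
trait TypedTensor[A] {
  def shape: Array[Int]
  def read: Future[A]
}

// A function over rank-2 float tensors: the rank is checked statically,
// the batch size only at run time.
def readBatch(input: TypedTensor[Array[Array[Float]]]): Future[Array[Array[Float]]] =
  input.read
```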
If we eventually add the type parameter for tensors, the …
Right, makes sense, though many changes are necessary and there are some issues to solve (e.g. the function …). For now, should we go with the (probably temporary) solution and address those issues as part of #104?
If in the future it will become the following type signature:

```scala
trait Tensor[A] {
  def read: Future[A]
}
```

then I thought a sane type signature for now could be:

```scala
trait Tensor {
  def read: Future[Any]
}
```

Do you think the …?
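To make the trade-off concrete, a hedged sketch of what the two signatures mean at a call site; the helper name and the rank-2 `Float` element type are assumptions:

```scala
import scala.concurrent.{ExecutionContext, Future}

// With the interim Future[Any] signature, the caller must know the rank and cast:
def readRows(tensor: Tensor)(implicit ec: ExecutionContext): Future[Array[Array[Float]]] =
  tensor.read.map(_.asInstanceOf[Array[Array[Float]]])

// With the parameterized signature the cast disappears:
// def readRows(tensor: Tensor[Array[Array[Float]]]): Future[Array[Array[Float]]] =
//   tensor.read
```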
Which type is preferred, …?
Note that the approach of n-dimensional …
We had created a view for …
If the main purpose of …
or were you asking about the underlying parametrization of …?
There are some issues to address in this PR: …
I can merge this PR for now.
Sure! I've subscribed to notifications on #104.
The incoming change will break backward compatibility… It's OK since the version number is still pre-1.0.
Please remove …
methods to read Tensor into Array and Seq
update README
Merged. Thank you! @evanhaldane
fixes /issues/130