
attempt absolute insanity #29

Open
timothyyu opened this issue May 8, 2018 · 3 comments
Labels: help wanted, hypothetical, invalid, question

Comments

@timothyyu (Owner)

https://keras.io/layers/convolutional/

to look into:
Keras Flatten, Permute, Reshape, and RepeatVector layers

[pure size and price, potentially position also]
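A quick way to keep the four layers above straight is by what they do to tensor shapes. The sketch below uses NumPy equivalents of each transformation (the Keras layer each line stands in for is named in the comment); all shapes here are illustrative, not from the issue:

```python
import numpy as np

# NumPy stand-ins for the shape transformations the Keras layers perform.
x = np.zeros((4, 3, 2))                 # a batch of 4 grids: (batch, rows, cols)

flat = x.reshape(x.shape[0], -1)        # Flatten:        (4, 3, 2) -> (4, 6)
perm = x.transpose(0, 2, 1)             # Permute((2, 1)): (4, 3, 2) -> (4, 2, 3)
resh = x.reshape(4, 2, 3)               # Reshape((2, 3)): (4, 3, 2) -> (4, 2, 3)

v = np.zeros((4, 5))                    # a batch of feature vectors
rep = np.repeat(v[:, None, :], 3, axis=1)  # RepeatVector(3): (4, 5) -> (4, 3, 5)

print(flat.shape, perm.shape, resh.shape, rep.shape)
```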

take L2 order book states as a grid, feed them into a Conv2D layer, then flatten/reshape the output, then feed it into an LSTM/GRU as input in series
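The grid → conv → flatten → sequence idea can be sketched without Keras. Below, a toy 2x2 averaging "convolution" stands in for a real Conv2D layer, and the window length, grid dimensions, and single filter are all made-up illustration values; the point is the shape of the sequence an LSTM/GRU would consume at the end:

```python
import numpy as np

T, H, W = 8, 10, 6                 # timesteps, price levels, feature columns (illustrative)
grids = np.random.rand(T, H, W)    # hypothetical L2 state grid per timestep

def conv2d_valid(img, k=2):
    """Toy 'valid' convolution: average over each k x k patch (stand-in for Conv2D)."""
    h, w = img.shape
    out = np.empty((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = img[i:i + k, j:j + k].mean()
    return out

# Per-timestep conv (what TimeDistributed(Conv2D) would do), then flatten each map.
feature_maps = np.stack([conv2d_valid(g) for g in grids])   # (T, H-1, W-1)
lstm_input = feature_maps.reshape(T, -1)                    # (T, (H-1)*(W-1))
print(lstm_input.shape)  # one flattened feature vector per timestep
```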

Lambda layer functionality for specialized weighting of values near support/resistance (s/r) lines?
https://github.com/keras-team/keras/blob/master/keras/layers/core.py#L564
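One hypothetical form such a weighting could take: boost each price level's contribution by a Gaussian bump centered on every s/r line. The function body below is plain NumPy and could in principle be wrapped in a Keras `Lambda` layer; the level values and `width` are invented for illustration:

```python
import numpy as np

def sr_weights(prices, sr_levels, width=0.5):
    """Return a weight >= 1 per price, peaking at each s/r level (hypothetical scheme)."""
    # Pairwise distances between each price and each s/r line: (n_prices, n_levels).
    d = np.abs(np.asarray(prices)[:, None] - np.asarray(sr_levels)[None, :])
    # 1 plus a Gaussian bump per line, summed over lines.
    return 1.0 + np.exp(-(d / width) ** 2).sum(axis=1)

w = sr_weights([99.0, 100.0, 103.0], sr_levels=[100.0, 105.0])
print(w)  # the price sitting on the 100.0 line gets the largest weight
```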

timothyyu added the help wanted, invalid, question, hypothetical labels May 8, 2018
timothyyu self-assigned this May 8, 2018
@timothyyu (Owner, Author) commented May 29, 2018

Parallelization of input for non-sequential model (conceptual, still testing/under development):
https://github.com/timothyyu/lstm-breaker

@timothyyu (Owner, Author)

Lambda function + attention mechanism potential
https://github.com/philipperemy/keras-attention-mechanism
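The core of the attention idea in the linked repo can be sketched in a few lines: score each recurrent hidden state, softmax the scores into weights over timesteps, and return the weighted sum as a context vector. `W` and `v` below are random stand-ins for learned parameters, and all sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
T, D = 6, 4                      # timesteps, hidden size (illustrative)
h = rng.normal(size=(T, D))      # hypothetical LSTM outputs, one row per timestep
W = rng.normal(size=(D, D))      # stand-in for a learned projection
v = rng.normal(size=(D,))        # stand-in for a learned scoring vector

scores = np.tanh(h @ W) @ v            # (T,): one relevance score per timestep
weights = np.exp(scores - scores.max())
weights /= weights.sum()               # softmax over timesteps
context = weights @ h                  # (D,): attention-weighted summary of the sequence
print(weights.round(3), context.shape)
```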

@timothyyu (Owner, Author)

related to #25, #38, #26, #22
