Currently, the learnt models require the nwords parameter, along with the main words input, when serving the models with TensorFlow Serving.
It would be very helpful if the nwords parameter could be omitted, so that only the words array could be streamed directly from database sources to the TensorFlow Serving instance running the learnt models.
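For context, a request to the served model currently has to carry both inputs. A minimal sketch of such a REST predict request, assuming the exported signature uses the input names "words" and "nwords" mentioned above (the endpoint URL and model name are placeholders):

```python
import json

# One tokenized sentence; with more sentences, each would need its own nwords.
sentences = [["John", "lives", "in", "Paris"]]

# Body for POST to a hypothetical
# http://localhost:8501/v1/models/ner:predict endpoint.
request_body = json.dumps({
    "instances": [
        {"words": sent, "nwords": len(sent)}  # nwords must accompany words
        for sent in sentences
    ]
})
print(request_body)
```

The request is what this issue asks to simplify: streaming only the words array would mean dropping the "nwords" field from each instance.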
If you're making batched predictions, sentences can be of different lengths and thus need to be padded before being fed to TensorFlow. Since the CRF is a sequence model, it needs the exact length of each sentence, so it won't be possible to omit the length.
It might be possible to slightly modify the code so that the length is not used at inference time, if you're always giving one sentence at a time (batches of size 1), but I would really not recommend it.
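A possible middle ground, not part of the original code: since right-padded batches use a known pad token, the exact lengths the CRF needs can be recovered from the words input itself, either client-side or with the equivalent tf.not_equal / tf.reduce_sum ops inside the serving graph. A minimal NumPy sketch, assuming a "<pad>" token that never appears inside a real sentence:

```python
import numpy as np

PAD = "<pad>"  # assumed padding token; the real model's pad token may differ


def recover_nwords(padded_words):
    """Count non-pad tokens per row of a right-padded batch.

    Returns one length per sentence, matching what the nwords
    input would otherwise have to carry.
    """
    arr = np.asarray(padded_words)
    return (arr != PAD).sum(axis=1)


batch = [
    ["John", "lives", "in", "Paris"],
    ["Hello", "world", PAD, PAD],
]
print(recover_nwords(batch))  # -> [4 2]
```

This only works if padding is strictly at the end of each row and the pad token is reserved; otherwise the recovered lengths would be wrong and the CRF decoding would silently degrade.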