- Fix docstring of connector `_mlp_transform`. (#192)
v0.2.2 (2019-08-05)
- Use lazy import to be compatible with texar-pytorch. (#183)
v0.2.1 (2019-07-28)
- Add support for GPT-2 345M model in examples/gpt-2. (#156)
- Add BERT modules, including `texar.modules.BERTEncoder` (doc) and `texar.modules.BERTClassifier` (doc). (#167)
- Refactor `TransformerEncoder` and `TransformerDecoder` to separate position embeddings from the modules. (#126)
- Allow passing a Tensor to `output_layer` of decoders' constructors -- used for weight tying between the output layer and the input embedding matrix. (#126)
- Make the `TransformerDecoder` constructor interface exactly the same as the RNN decoders' constructor interfaces. (#126)
- Refactor decoder `Helper`s to allow a two-argument `embedding_fn` (supporting position embedding). (#126)
- Refactor `SinusoidsPositionEmbedder` to enable arbitrarily large or negative position indexes. (#176)
- Fix `texar.losses.reduce_batch_time` when `sequence` has dtype other than `tf.float32`. (#143)
- Fix `texar.losses.reduce_dimensions` when `average_axes` or `sum_axes` is `int`. (#141)
- Fix GPT-2 tokenization loading path. (#165)
- Fix examples/vae_text EOS bug. (#168)
- Fix transformer bleu_tool.py when `translation_length` is 0. (#176)
- Fix `StochasticConnector` and `ReparameterizedStochasticConnector` when `transform=False`. (#179)
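The `SinusoidsPositionEmbedder` change above extends the standard sinusoid formula to arbitrary integer positions. As a minimal illustration (a plain-Python sketch of the sinusoid formula itself, not Texar's actual implementation; the function name is hypothetical):

```python
import math

def sinusoid_position_embedding(position, dim, base=10000.0):
    """Sinusoid embedding for a single integer position.

    Works for any integer position, including negative ones, since
    sin and cos are defined on the whole real line. `dim` must be even;
    the first dim/2 channels are sines, the last dim/2 are cosines.
    """
    half = dim // 2
    # Each channel pair i uses frequency 1 / base^(2i/dim).
    angles = [position / (base ** (2 * i / dim)) for i in range(half)]
    return [math.sin(a) for a in angles] + [math.cos(a) for a in angles]

# Position 0 gives all-zero sines and all-one cosines; negative
# positions mirror positive ones in the sine channels.
emb = sinusoid_position_embedding(-5, dim=8)
```

Because sin is odd and cos is even, a negative index needs no special casing: its embedding is the mirror image of the positive index's embedding in the sine channels.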
v0.2.0 (2019-04-09)
- `TFRecordData`: A new data module for reading and processing TFRecord data, with support for, e.g., image data, feature data, etc. (#107)
- `GPT-2`: OpenAI pretrained language model. (#91, example)
- `TopKSampleEmbeddingHelper` to perform top_k random sample decoding. (baa09ff)
- Refactor `BERT` example using the `TFRecordData` data module.
- `TransformerDecoder` supports `helper` arguments to specify the decoding strategy. (#76)
- Fix variable collection bug in `examples/seqgan`. (#110)
- Fix error when `beam_search_decode` is used with `output_layer=tf.identity`. (#77)
- Fix readthedocs compilation error. (#85)
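For context on the top_k random sample decoding that `TopKSampleEmbeddingHelper` performs, here is a minimal stdlib-only sketch of one decoding step (the function name and interface are illustrative, not Texar's API):

```python
import math
import random

def top_k_sample(logits, k, rng=random):
    """Sample a token index from the k highest-scoring logits.

    Logits outside the top k are excluded, so sampling is restricted
    to the k most likely tokens; within the top k, tokens are drawn
    with softmax probabilities.
    """
    # Indices of the k largest logits.
    top_ids = sorted(range(len(logits)), key=lambda i: logits[i])[-k:]
    # Numerically stable softmax over just the top-k logits.
    m = max(logits[i] for i in top_ids)
    weights = [math.exp(logits[i] - m) for i in top_ids]
    total = sum(weights)
    return rng.choices(top_ids, weights=[w / total for w in weights], k=1)[0]

rng = random.Random(0)
token = top_k_sample([2.0, 0.5, -1.0, 3.0], k=2, rng=rng)
# token is always one of the two largest-logit indices, 0 or 3.
```

At each step the helper-style decoder would feed the sampled token's embedding back in as the next input, rather than the argmax token as in greedy decoding.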