Multi-layer Recurrent Neural Networks (LSTM, GRU, RNN) for character-level language models in Torch. Based on the code of https://github.com/karpathy/char-rnn. Supports Chinese and other features.
Code accompanying Incorporating Chinese Characters of Words for Lexical Sememe Prediction (ACL2018) https://arxiv.org/abs/1806.06349
Multilingual character-based named entity recognizer
Implementation of the Character-level Intra Attention Network (CIAN) for Natural Language Inference (NLI) on the SNLI and MultiNLI corpora
In this project, I worked with a small corpus of simple sentences. I extracted n-grams with the NLTK library and performed word-level and character-level one-hot encoding. I also used the Keras Tokenizer to tokenize the sentences and implemented word embeddings with the Embedding layer for sentiment analysis.
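The character-level one-hot encoding step described above can be sketched in plain Python (without the Keras or NLTK utilities the project itself uses); the function name `char_one_hot` and the sample corpus are illustrative, not from the repository:

```python
# Minimal character-level one-hot encoding sketch (stdlib only, no Keras).
# Each text becomes a (max_len x vocab_size+1) grid of 0/1 rows; index 0 is
# reserved for padding, as Keras conventionally does.
def char_one_hot(texts):
    # Build a character vocabulary from the whole corpus.
    chars = sorted({c for t in texts for c in t})
    index = {c: i + 1 for i, c in enumerate(chars)}
    max_len = max(len(t) for t in texts)
    encoded = []
    for t in texts:
        rows = []
        for pos in range(max_len):
            row = [0] * (len(index) + 1)  # +1 for the padding slot at index 0
            if pos < len(t):
                row[index[t[pos]]] = 1  # mark the character at this position
            rows.append(row)
        encoded.append(rows)
    return encoded, index

texts = ["the cat", "the dog"]
one_hot, index = char_one_hot(texts)
```

An Embedding layer replaces these sparse one-hot rows with dense learned vectors, which is why the project switches to it for the sentiment model.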
Explore AI-powered text generation with a character-level transformer model that mimics Shakespeare’s style.
Character-level fork of Fairseq for sequence-to-sequence learning
An implementation of character-level text generation with an LSTM.
On Anonymous Commenting: A Greedy Approach to Balance Utilization and Anonymity for Instagram Users - Accepted at SIGIR 2019