This tutorial demonstrates how to train a sequence-to-sequence (seq2seq) model for Spanish-to-English translation, roughly based on Effective Approaches to Attention-based Neural Machine Translation (Luong et al., 2015).

This tutorial: an encoder/decoder connected by attention.

While this architecture is somewhat outdated, it is still a very useful project to work through to get a deeper understanding of sequence-to-sequence models and attention mechanisms (before going on to Transformers).

After training the model in this notebook, you will be able to input a Spanish sentence, such as "¿todavia estan en casa?", and get back the English translation: "are you still at home?"

The resulting model is exportable as a tf.saved_model, so it can be used in other TensorFlow environments.

The translation quality is reasonable for a toy example, but the generated attention plot is perhaps more interesting. It shows which parts of the input sentence have the model's attention while translating:

![Attention plot for a translated sentence]()

Note: This example takes approximately 10 minutes to run.

This example assumes some knowledge of TensorFlow fundamentals below the level of a Keras layer, such as writing custom keras.Models and keras.layers.

Setup

```shell
pip install "tensorflow-text>=2.11"
pip install einops
```

```python
import numpy as np
```
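The core idea the tutorial builds toward, a decoder state attending over encoder outputs, can be sketched in a few lines of NumPy. This is a hypothetical, illustrative sketch of Luong-style dot-product attention, not the tutorial's actual layer; the function name and shapes are assumptions for the example:

```python
import numpy as np

def luong_attention(query, keys):
    """Illustrative Luong-style dot-product attention (not the tutorial's code).

    query: (d,) decoder hidden state.
    keys:  (T, d) encoder outputs, one row per input position.
    Returns (context, weights): the attention-weighted sum of the
    encoder outputs and the softmax weights over the T positions.
    """
    scores = keys @ query                # (T,) alignment scores
    scores = scores - scores.max()       # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax -> sums to 1
    context = weights @ keys             # (d,) context vector for the decoder
    return context, weights

# Toy example: 4 input positions, 3-dimensional states.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 3))
query = rng.normal(size=3)
context, weights = luong_attention(query, keys)
```

The `weights` vector is exactly what the attention plot visualizes: one probability per input token, showing where the model "looks" while producing each output token.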