Why do we add one extra row when building the embedding_matrix?
Why `nb_words = len(tokenizer.word_index) + 1`?

Answer: Keras's `Tokenizer.word_index` starts numbering words at 1, so index 0 is never assigned to any word. The embedding matrix therefore needs `len(word_index) + 1` rows so that the largest word index still fits, and row 0 can be left as all zeros to serve as the padding vector.
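The original code snippet is truncated, so here is a minimal sketch of the usual pattern, assuming a `word_index` dict from a fitted Keras `Tokenizer` and a hypothetical `embeddings_index` mapping each word to a pretrained vector:

```python
import numpy as np

# Hypothetical inputs for illustration:
# word_index as produced by a fitted Keras Tokenizer (indices start at 1),
# embeddings_index mapping word -> pretrained embedding vector.
word_index = {"the": 1, "cat": 2, "sat": 3}
embedding_dim = 4
rng = np.random.default_rng(0)
embeddings_index = {w: rng.random(embedding_dim) for w in word_index}

# +1 because Tokenizer indices start at 1; row 0 stays all zeros
# and acts as the padding vector.
nb_words = len(word_index) + 1
embedding_matrix = np.zeros((nb_words, embedding_dim))
for word, i in word_index.items():
    vector = embeddings_index.get(word)
    if vector is not None:        # words missing a pretrained vector stay zero
        embedding_matrix[i] = vector

print(embedding_matrix.shape)     # (4, 4): 3 words + 1 padding row
```

Without the `+ 1`, the word with the highest index would fall outside the matrix and raise an `IndexError`.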