from keras.preprocessing.text import Tokenizer error: AttributeError: module 'tensorflow.compat.v2' has no attribute '__internal__'

In some NLP code, I import the vocabulary mapper Tokenizer from Keras:

from keras.preprocessing.text import Tokenizer

Executing the code reports the following error:

AttributeError: module 'tensorflow.compat.v2' has no attribute '__internal__'

I searched Baidu for this error for a long time but only found similar problems. The fix is simply to change the above import to:

from tensorflow.keras.preprocessing.text import Tokenizer

That’s it!
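
For reference, here is a minimal sketch showing the corrected import in use; the sample sentences and the num_words value are illustrative, not from the original post:

from tensorflow.keras.preprocessing.text import Tokenizer

# Illustrative sample sentences (not from the original post)
texts = ["the cat sat on the mat", "the dog ate my homework"]

tokenizer = Tokenizer(num_words=100)  # keep only the 100 most frequent words
tokenizer.fit_on_texts(texts)         # build the word -> integer-index vocabulary

print(tokenizer.word_index)                 # e.g. {'the': 1, 'cat': 2, ...}
print(tokenizer.texts_to_sequences(texts))  # sentences as integer index sequences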
