The character limit for keywords is 8000. The limit for text is 32766 bytes. Elasticsearch is built on top of a library called Apache Lucene, and Lucene's per-term byte-length limit is 32766 bytes. Hitting that limit with a text field would be difficult, since the tokenizer splits the input string into smaller terms. Input that produces a single enormous token (nothing for the tokenizer to split on) could theoretically hit it, but in practice that is unlikely to be a concern.
The text vs. keyword mapping type controls how the underlying data structure stores the value and how it is made searchable. The main difference is that text is analyzed and split into tokens, while keyword is indexed as a single whole term.
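As a sketch, a mapping can index the same string both ways using a multi-field. This is the request body for a create-index call (e.g. `PUT /my-index`); the index name and field names (`title`, `title.raw`) are placeholders, and `ignore_above` is set here to match the 8000-character limit mentioned above:

```json
{
  "mappings": {
    "properties": {
      "title": {
        "type": "text",
        "fields": {
          "raw": {
            "type": "keyword",
            "ignore_above": 8000
          }
        }
      }
    }
  }
}
```

With this mapping, full-text queries run against `title`, while exact-match filters, sorting, and aggregations use `title.raw`.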
Example of text vs. keyword mapping, using the indexed value “Simple Search”:
With “Simple Search” mapped as text, a user can search for any individual word and find the document: “Simple” or “Search” would each return results. Mapped as keyword, the exact full value “Simple Search” is required to match.
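The matching behavior above can be sketched outside Elasticsearch with a toy analyzer. This is a simplified stand-in for the standard analyzer (which also lowercases tokens, which is why a lowercase query still matches a text field), not Elasticsearch's actual implementation:

```python
def analyze(value: str) -> list[str]:
    # Simplified stand-in for the standard analyzer:
    # lowercase, then split on whitespace.
    return value.lower().split()

def text_match(indexed: str, query: str) -> bool:
    # text mapping: match if any analyzed query token
    # appears among the analyzed indexed tokens.
    return bool(set(analyze(query)) & set(analyze(indexed)))

def keyword_match(indexed: str, query: str) -> bool:
    # keyword mapping: the stored value must match exactly.
    return indexed == query

doc = "Simple Search"
print(text_match(doc, "Simple"))            # True
print(text_match(doc, "search"))            # True (analyzer lowercases)
print(keyword_match(doc, "Simple"))         # False
print(keyword_match(doc, "Simple Search"))  # True
```

The sketch also shows why keyword fields suit filters and aggregations: equality on the whole value is all they support.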