FIX: Reduce input of to_tsvector to follow limits (PR #13806)

Long posts may have cooked fields that produce tsvectors longer than PostgreSQL's maximum tsvector size of 1MiB (1,048,576 bytes). This commit indexes only the first million characters of the scrubbed cooked text.

Truncating to exactly 1MiB (1_048_576 characters) would not be sufficient, because the output tsvector can be longer than its input; capping at 1,000,000 characters gives us some breathing room.
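
A minimal sketch of the approach, assuming scrubbing happens before truncation (the method name, constant name, and regex scrub below are illustrative, not the actual diff; the real indexer uses its own HTML scrubber):

```ruby
# PostgreSQL rejects tsvectors larger than 1MiB (1,048,576 bytes), so cap
# the input well below that limit. (Names here are illustrative.)
MAX_INDEXED_CHARS = 1_000_000

def indexable_cooked(cooked)
  # Crude HTML scrub for illustration only.
  scrubbed = cooked.gsub(/<[^>]+>/, " ").squeeze(" ").strip
  # Keep only the first million characters of the scrubbed text.
  scrubbed[0...MAX_INDEXED_CHARS]
end
```

The truncated string is what ultimately gets passed to to_tsvector on the database side, keeping the result under the limit.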

Shouldn’t we be doing this on all the weights, somewhere around here instead?

I looked at all the update_index calls, and none of the others passes strings that can be very long. If you want, I can do it there.
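
For context, the reviewer's suggestion would put the cap inside update_index itself, so every weighted field is bounded in one place. A hypothetical sketch of that alternative (the signature and weight parameter names are assumptions, not Discourse's actual API):

```ruby
MAX_INDEXED_CHARS = 1_000_000

# Hypothetical central cap: truncate every weighted field once, here,
# rather than only the cooked text at its call site.
def update_index(table:, id:, a_weight: nil, b_weight: nil, c_weight: nil, d_weight: nil)
  a_weight, b_weight, c_weight, d_weight =
    [a_weight, b_weight, c_weight, d_weight].map { |s| s&.slice(0...MAX_INDEXED_CHARS) }
  # ... then build the indexed data as before, e.g. with
  # setweight(to_tsvector(:a_weight), 'A') || setweight(to_tsvector(:d_weight), 'D')
end
```

As the reply notes, only the cooked text gets long in practice, which is why the cap was kept at its single call site.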