First, just FYI, you should be thinking in terms of transactions rather than blocks, since each stream item is contained by a single transaction. There can be a large number of transactions in a single block.
Second, there is no explicit recommendation on this question. But as you say, when there are thousands of matching JSON objects to be merged together, the getstreamkeysummary API command can become slow. So if this becomes an issue for your application, you could use the second option you mention.
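To illustrate, here is a minimal sketch of merging items client-side after fetching them with getstreamkeyitems. The item shape (`"data": {"json": {...}}`) and the merge rule (later items override earlier ones, field by field) are assumptions here; check the merge semantics documented for your MultiChain version before relying on them:

```python
def merge_items(items):
    """Merge JSON-object stream items, later items overriding earlier fields.

    Assumes each item looks like the output of getstreamkeyitems, i.e.
    {"data": {"json": {...}}, ...}; non-JSON items are skipped.
    """
    summary = {}
    for item in items:
        data = item.get("data")
        obj = data.get("json") if isinstance(data, dict) else None
        if isinstance(obj, dict):
            summary.update(obj)  # shallow, per-field "last writer wins"
    return summary

# Hypothetical items for the same key, oldest first:
items = [
    {"data": {"json": {"status": "open", "owner": "alice"}}},
    {"data": {"json": {"status": "closed"}}},
]
print(merge_items(items))  # {'status': 'closed', 'owner': 'alice'}
```

Fetching the items in pages (via the count/start parameters of getstreamkeyitems) and merging incrementally like this keeps the per-call cost bounded, at the price of doing the merge in your application instead of on the node.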
The other possibility is to reconsider your schema design, so that you don't end up with so many items under the same key. What are the fields you are updating for this key? Perhaps there's another way to represent things, for example by splitting frequently updated fields out under their own keys.