Interesting that you guys were debating how to assign a rank value to hive insects versus humans. One of the things HH brings up is that we can be mistaken about which metrics are the most important; but it also seems to be saying that certain metrics really are the most relevant, even if we didn't foresee them.
For instance, in HH the 'normal' human civilization might have considered the metric of power to be military might or intelligence-gathering. By those standard criteria, even the hive we see in the book couldn't compare to the U.S. in terms of pure might or number of spies. But the big surprise is that these metrics only seem like the key ones; the metric that actually mattered most was who would be first to develop a doomsday device. This is likely a deliberate analogy to nuclear war, with the proviso that in this case it wasn't just the doomsday tech that mattered, but the ability to deploy it in a completely one-sided and safe way that would devastate the enemy and leave your side unscathed.
Another criterion we see upended in the book is cultural supremacy. We are given plenty of reason to be repulsed by the brutal and cruel (to say nothing of gross) methods of the hive, and based on our cultural and moral understandings we would see ourselves as superior to them - or perhaps more enlightened. We could feel this way, at least to an extent, because of our numbers and our mass media. The hive wasn't a 'competing culture' precisely because it was hidden. And yet the culture wars in this story end up being decided not by majority rule, but by which culture would be first to develop the right weaponry against the other side. And maybe also by which culture was better at cloaking itself.
So I don't think it's so wrong to have goalposts that are hard to define here; the theme of upended metrics is core to any reading of the book. Nor do I see it as wrong to suppose that we may be superior to and more sophisticated than hive insects, but what the book says, at any rate, is that we should be prepared to one day realize we were looking at it all wrong and that another factor was far more relevant. Then again, the factors we assume are relevant may actually turn out to be so. The point is that we'll never quite know we're right, and will probably only be proven wrong when it's too late. It's a fatalistic theme.