Semantic memory, the cognitive system devoted to conceptual knowledge, handles a large quantity of information interlinked through manifold relationships. Everyday experience with healthy adult subjects shows that word search and retrieval are efficient: they yield fluent and coherent speech. This efficiency implies that either semantic memory encodes, beyond thousands of words, distinct links for distinct relationship types (at the cost of greater complexity and storage), or its structure evolves, through learning and experiential reinforcement of already existing knowledge, to combine storage economy with fast retrieval. In this article we explore how a network representation of semantic memory facilitates the extraction of categorical relations between pairs of words, arguing that this efficient mechanism is a consequence of long-term navigation upon the semantic topology. To this end, we first present and characterize an empirical data set modeled as a network; we then simulate naïve cognitive navigation on top of this topology. We model the latter process as uncorrelated random walks from node to node, which converge to a network of feature vectors. The strength of a categorical relationship (pairwise semantic similarity) is then measured as the overlap between the vectors of the two words.
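The navigation scheme just described can be sketched in code: run many short uncorrelated random walks from each node, take the resulting visit frequencies as that node's feature vector, and measure pairwise similarity as vector overlap. This is a minimal illustration on a toy graph; the node names, walk parameters, and the use of cosine overlap as the similarity measure are assumptions for the sketch, not details taken from the empirical data set.

```python
import random
from math import sqrt

def random_walk_features(adj, n_walks=2000, walk_len=3, seed=0):
    """For each start node, run many short uncorrelated random walks
    and record visit frequencies as that node's feature vector."""
    rng = random.Random(seed)
    nodes = sorted(adj)
    feats = {}
    for start in nodes:
        counts = {v: 0 for v in nodes}
        for _ in range(n_walks):
            current = start
            for _ in range(walk_len):
                current = rng.choice(adj[current])  # uncorrelated step
                counts[current] += 1
        total = n_walks * walk_len
        feats[start] = [counts[v] / total for v in nodes]
    return feats

def overlap(u, v):
    """Cosine overlap between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v)))

# Toy semantic network: two loose "categories" bridged by one link.
adj = {
    "dog":   ["cat", "wolf"],
    "cat":   ["dog", "wolf"],
    "wolf":  ["dog", "cat", "car"],
    "car":   ["truck", "wolf", "bus"],
    "truck": ["car", "bus"],
    "bus":   ["truck", "car"],
}

feats = random_walk_features(adj)
# Within-category overlap (dog, cat) comes out higher than the
# cross-category overlap (dog, bus), reflecting categorical similarity.
sim_within = overlap(feats["dog"], feats["cat"])
sim_across = overlap(feats["dog"], feats["bus"])
```

Short walks are essential here: very long walks would converge to the graph's stationary distribution, making every node's feature vector nearly identical and washing out the categorical structure.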