Whenever I’m asked to explain our catalog enrichment and refinement system, I go back to the example that was its genesis.
As design and aesthetic considerations permeate more and more of what we do, they become an increasingly important way to distinguish one good or service from another. For that reason, it’s unsurprising that we receive catalogs from our clients with labels like “burnt umber” or “light poppy” for the color of, say, a sweater they sell. From a marketing perspective, “imperial violet” has a much better ring than “purple.”
These are certainly useful data to have for the design-driven shoppers who have the design vocabulary and specificity to ask a search system for something like a “light poppy sweater.” But what about more typical shoppers who might want to search with something more generic like “red sweater” or “light red sweater”?
Once upon a time, it might have been necessary to have a team of people comb through the catalog and try to come up with synonyms for “light poppy” or “dark onyx,” a process that’s time-consuming, expensive, and, worse still, not always exhaustive. And certainly, we wouldn’t want to tell our customers to go back and erase the hard work of their marketing and creative teams for the sake of making search by color work well.
Hence my question that helped catalyze the development of this tool: “Couldn’t we look at the product images and figure out what color an item is?”
Many months later, our catalog enrichment system can do this and much more. Our text analysis tools can look for latent meanings, or relate similar concepts to one another. We know that coats are lexically adjacent to jackets, that pumps are a type of shoe, and that a skirt might be functionally interchangeable with a pair of jeans. Our image analysis tools can, of course, map something’s color to reference values. They also power search and annotation by allowing us to identify a garment’s taxonomic properties — that a pair of Repetto flats are women’s shoes without a heel, say — and fine-grained latent visual properties that allow us to find items visually similar to a target along axes as precise as sleeve length.
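The color-mapping idea can be sketched quite simply. This is a minimal illustration, not our production pipeline: it assumes you’ve already extracted a dominant RGB value from a product image, and it uses a tiny hand-picked palette with made-up reference values. A real system would use a much larger palette and a perceptually uniform color space such as CIELAB rather than raw RGB distances.

```python
import math

# Tiny illustrative reference palette (RGB values are assumptions for
# this sketch; a production table would be far larger and tuned).
REFERENCE_COLORS = {
    "black": (0, 0, 0),
    "white": (255, 255, 255),
    "gray": (128, 128, 128),
    "red": (220, 40, 40),
    "purple": (128, 0, 128),
    "orange": (255, 140, 0),
}

def nearest_reference_color(rgb):
    """Map an arbitrary RGB triple to the closest generic color name,
    by Euclidean distance in RGB space."""
    return min(
        REFERENCE_COLORS,
        key=lambda name: math.dist(REFERENCE_COLORS[name], rgb),
    )

# A "light poppy" sweater whose dominant pixel color turns out to be
# a soft red (sample value assumed for illustration):
print(nearest_reference_color((230, 80, 70)))  # → red
```

With a mapping like this in place, “light poppy” items become findable under “red” without anyone touching the original catalog labels.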
The result: our search customers don’t have to go back and revise every single item they sell. Instead, our system can figure out that someone looking for a “black top” might want that “silk chemise” in “deepest gray.” Or simply that “gray” and “grey” are the same thing. If you’d like to find out more, get in touch with our team at firstname.lastname@example.org.