As a general rule, AI isn't great at taking in new information and using it to make better sense of what it already knows.
The project focused on exchanging info through ordinary language ("in front of me there's a Brooks Brothers"), but it produced an interesting side discovery: the team learned that the bots were more effective when they communicated through a "synthetic" language made of symbols rather than words. In other words, the conversations that help you find your hotel might need to be different from those that help, say, a self-driving car.
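To make that concrete, here's a minimal sketch (in Python, with entirely hypothetical names; this is not Facebook's code) of why a fixed symbolic vocabulary can be an easier machine-to-machine channel than free-form English: every message has a fixed length and every slot has a fixed meaning, so nothing has to be parsed.

```python
# Minimal sketch, not Facebook's code: a fixed "synthetic" vocabulary
# as a machine-to-machine channel. All names (Tourist, Guide,
# LANDMARKS) are hypothetical.

LANDMARKS = ["bank", "cafe", "hotel", "shop", "theater"]

class Tourist:
    """Encodes what it currently sees into a fixed-length symbol string."""
    def encode(self, visible):
        # One slot per landmark type: "1" if visible, "0" if not.
        return "".join("1" if name in visible else "0" for name in LANDMARKS)

class Guide:
    """Decodes the symbol string back into a set of landmarks, losslessly."""
    def decode(self, message):
        return {name for name, bit in zip(LANDMARKS, message) if bit == "1"}

tourist, guide = Tourist(), Guide()
msg = tourist.encode({"cafe", "hotel"})
print(msg)                # "01100" -- compact, unambiguous, fixed-length
print(guide.decode(msg))  # {'cafe', 'hotel'} -- no language parsing needed
```

The contrast with "in front of me there's a Brooks Brothers" is the point: the symbolic channel trades expressiveness for precision, which is fine when both ends of the conversation are machines.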
The research also helped Facebook's AI make sense of visually complex urban environments. A "Masked Attention for Spatial Convolutions" (MASC) mechanism let the bots quickly pick out the most relevant keywords in each other's messages, so they could more accurately convey where they were or where they needed to go.
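Roughly how that masking might work, as a sketch under my own assumptions (a toy 3x3 map and a hand-built "north" kernel stand in for the weights the real system would learn from dialogue): attention derived from the message masks a convolution that shifts the guide's belief about the tourist's position across a 2D map.

```python
# A sketch under stated assumptions, not the paper's implementation:
# keywords in the tourist's last message are taken to yield the 3x3
# kernel_logits; MASC would derive these from the dialogue itself.
import numpy as np

def masc_step(belief, kernel_logits):
    """Shift a spatial belief map using a message-derived attention mask.

    belief:        (H, W) probabilities over where the tourist might be.
    kernel_logits: (3, 3) scores over movement directions.
    """
    mask = np.exp(kernel_logits - kernel_logits.max())
    mask /= mask.sum()                        # softmax -> attention mask
    H, W = belief.shape
    padded = np.pad(belief, 1)                # zero-pad the borders
    out = np.zeros_like(belief)
    for i in range(H):                        # convolve belief with the mask
        for j in range(W):
            out[i, j] = (padded[i:i + 3, j:j + 3] * mask[::-1, ::-1]).sum()
    return out / out.sum()

belief = np.zeros((3, 3)); belief[1, 1] = 1.0     # "somewhere in the middle"
north = np.full((3, 3), -9.0); north[0, 1] = 9.0  # message implies "headed north"
print(masc_step(belief, north).round(2))          # mass moves one cell north
```

The design choice the sketch illustrates: instead of parsing a sentence into explicit coordinates, the message directly reweights a spatial operation, so vague language still produces a usable update to the map.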
As our TechCrunch colleagues observed, this is a research project meant to improve AI as a whole rather than an immediate precursor to a navigation product. With that said, it's easy to see the practical implications: self-driving cars could use this to find their way when they can't rely on GPS, or to offer directions to wayward humans based on nothing more than vague descriptions.