Delivering high-quality, context-relevant information in a timely manner is a priority for location-based services (LBS), where applications require an immediate response based on spatial interaction. Previous work in this area has typically focused on determining this interaction ever more accurately and informing the user in the customary graphical way, via the visual modality. This paper describes the research area of multimodal LBS and focuses on audio as the key delivery mechanism. This research extends familiar graphical information delivery by introducing a geoservices platform for delivering multimodal content and navigation services. The platform incorporates a novel auditory user interface (AUI) that enables the delivery of natural-language directions and rich media content using audio. This unifying concept provides a hands-free modality for navigating a mapped space while simultaneously enjoying rich media content that is dynamically constructed, using mechanisms such as algorithmic music and phrase synthesis to generate task-relevant content based on the path taken. The paper outlines the innovative ideas employed in the design of the AUI and details the geoservices platform developed to facilitate the authoring and delivery of multimodal LBS applications. It concludes with a discussion of the results of a live user trial. The results are analysed and presented to validate the original hypothesis for this approach, to address the research questions outlined, and to inform further research directions. They show that the proposed solution significantly advances the state of the art in mobile tour production. They also show that an AUI is an effective modality for the delivery of audio content and natural directions when used in combination with a graphical user interface, significantly reducing overheads in terms of content size and network usage. Finally, the results indicate that the AUI provides a good overall user experience, performing well in the user trial.