GeoAnalytics Engine supports geocoding and network analysis tools. To use these tools, you must first set up the required components described below.
The geocoding tools require a locator and the network analysis tools require a network dataset. The locator or network dataset must be locally accessible to all nodes in your Spark cluster. In a cloud environment, you can first upload the locator or network dataset to a file system like Amazon S3 and then mount or copy it to each node's local system. This location in each node's file system needs to have enough disk space to store the locator or network dataset.
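For example, staging a locator from Amazon S3 onto a node's local file system could look like the following sketch. The bucket name, prefix, and local path are placeholders, and the same commands would need to run on every node in the cluster (for example, from a bootstrap or init script):

```shell
# Hypothetical example: copy a locator from S3 to a node's local disk.
# The bucket name (my-bucket), prefix (locators/), and destination
# directory (/data/locators/) are placeholders for your own values.
mkdir -p /data/locators
aws s3 cp s3://my-bucket/locators/ /data/locators/ --recursive
```

The destination directory must have enough free disk space to hold the full locator or network dataset.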
Here is an example of how to stage the locator or network dataset in Databricks:
- Upload the locator or network dataset to a cloud file system like Azure Blob Storage.
- Install GeoAnalytics Engine on Databricks.
- In a notebook, mount the locator or network dataset to DBFS using the `dbutils.fs.mount` command.
- Update the cluster-scoped init script to copy the files from the mounted location to `/databricks/`:

  ```
  cp -r /dbfs/mnt/locators/. /databricks/locators/
  cp -r /dbfs/mnt/network_datasets/. /databricks/network_datasets/
  ```
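The mount step above might look like the following sketch for data staged in Azure Blob Storage. It runs only inside a Databricks notebook, and the storage account, container, mount point, and secret scope names are all placeholders:

```python
# Hypothetical mount of an Azure Blob Storage container that holds the
# locator files. All names below are placeholders for your own values.
dbutils.fs.mount(
    source="wasbs://locators@mystorageaccount.blob.core.windows.net",
    mount_point="/mnt/locators",
    extra_configs={
        "fs.azure.account.key.mystorageaccount.blob.core.windows.net":
            dbutils.secrets.get(scope="my-scope", key="storage-key")
    },
)
```

After the cluster restarts, the init script copies the mounted files to each node's local disk, where the geocoding and network analysis tools can read them.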