Install GeoAnalytics Engine on Microsoft Fabric

Microsoft Fabric is an all-in-one analytics solution that offers a comprehensive suite of services, including data lake, data engineering, and data integration, in one place. Follow the steps below to use GeoAnalytics Engine within a Microsoft Fabric notebook.

The table below summarizes the Microsoft Fabric runtimes supported by each version of GeoAnalytics Engine.

GeoAnalytics Engine    Microsoft Fabric Runtime
1.4.x                  1.1 (Spark 3.3, Delta 2.2), 1.2 (Spark 3.4, Delta 2.4)

To complete this install guide, you will need:

  • Access to Microsoft Fabric. If you do not have access to Microsoft Fabric, get access by starting a free trial.
  • GeoAnalytics Engine install files. If you have a GeoAnalytics Engine subscription with a username and password, you can download the ArcGIS GeoAnalytics Engine distribution here after signing in. If you have a license file, follow the instructions provided with your license file to download the GeoAnalytics Engine distribution.
  • A GeoAnalytics Engine subscription or a license file.

Prepare the workspace

  1. Log in to Microsoft Fabric.

  2. From the Fabric homepage, find and open the Data Engineering experience.

  3. If you do not have an existing Fabric environment for GeoAnalytics Engine, create one.

  4. Under the Custom libraries section of the environment, select the upload option and upload the GeoAnalytics Engine .whl file.

  5. Under the compute section of the environment, select one of the Fabric runtimes that the GeoAnalytics Engine version supports.

  6. Make any additional changes to the environment as needed, then save and publish the changes.

  7. Return to the Data Engineering experience homepage and create a lakehouse or use an existing lakehouse.

  8. Within the lakehouse, click the Files folder, select the upload option, and upload the GeoAnalytics Engine .jar file. Depending on the analysis you will complete, optionally upload the following jars:

    • esri-projection-geographic if you need to perform a transformation that requires supplementary projection data.
    • geoanalytics-natives to use geocoding or network analysis tools.

Prepare the Fabric notebook

  1. Create a new notebook or open an existing one. Choose “PySpark (Python)” as the primary language.

  2. In the Fabric notebook, select the environment that has the GeoAnalytics Engine .whl file.

  3. Attach a lakehouse to the Fabric notebook.

  4. Add a new cell to the notebook and paste in the Spark session configuration magic command below. Modify the "jars" section to include the ABFS lakehouse path(s) to the GeoAnalytics .jar file(s) needed for your analysis.

    %%configure
    {
        "jars" : [
            "abfss://.../Files/geoanalytics.jar"
        ],
        "conf":
        {
            "spark.plugins": "com.esri.geoanalytics.Plugin",
            "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
            "spark.kryo.registrator": "com.esri.geoanalytics.KryoRegistrator"
        }
    }

    Select Run on the cell. This will load the specified GeoAnalytics .jar file(s) and start the Spark session.
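
    If your analysis also needs the optional jars uploaded in step 8 of the workspace preparation, list them alongside the core jar. A minimal sketch, assuming the uploaded files are named esri-projection-geographic.jar and geoanalytics-natives.jar (adjust the file names and lakehouse paths to match what you uploaded):

    %%configure
    {
        "jars" : [
            "abfss://.../Files/geoanalytics.jar",
            "abfss://.../Files/esri-projection-geographic.jar",
            "abfss://.../Files/geoanalytics-natives.jar"
        ],
        "conf":
        {
            "spark.plugins": "com.esri.geoanalytics.Plugin",
            "spark.serializer": "org.apache.spark.serializer.KryoSerializer",
            "spark.kryo.registrator": "com.esri.geoanalytics.KryoRegistrator"
        }
    }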

Authorize GeoAnalytics Engine

  1. Import the geoanalytics library and authorize it using your username and password or a license file. See Authorization for more information. For example:

    import geoanalytics
    geoanalytics.auth(username="User1", password="p@ssw0rd")
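
    If you prefer not to hardcode credentials in the notebook, you can read them from environment variables or a secrets service instead. A minimal sketch using Python's standard os module (the variable names GAE_USERNAME and GAE_PASSWORD are placeholders you would set yourself):

    import os
    import geoanalytics

    # Credentials are read from the environment rather than typed into the notebook.
    geoanalytics.auth(username=os.environ["GAE_USERNAME"],
                      password=os.environ["GAE_PASSWORD"])
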
  2. Try out the API by importing the SQL functions as an easy-to-use alias like ST and listing the first 20 functions in a notebook cell:

    from geoanalytics.sql import functions as ST
    spark.sql("show user functions like 'ST_*'").show()

What’s next?

You can now use any SQL function, track function, or analysis tool in the geoanalytics module.

See Data sources and Using DataFrames to learn more about how to access your data from your notebook. Also see Visualize results to get started with viewing your data on a map. For examples of what else is possible with GeoAnalytics Engine, check out the sample notebooks, tutorials, and blog posts.
