This sample demonstrates how to import an existing set of parcels into ArcGIS Urban. The parcels are located in the city of Uppsala, Sweden. The geometry, and optionally also the attributes, of the parcels are loaded directly from a GeoJSON file. Download the parcels in GeoJSON format used in this sample.
To test the code from this sample, set up your own new urban model based on the USA Default template. This way you have access to the relevant feature service and can create, update, or delete objects.
Import the relevant libraries and the Urban API schema stored as a Python module. See the Python sgqlc client library section for instructions on how to export the schema to a Python module. The geopandas library is used to read the parcel geometry from a GeoJSON file.
from sgqlc.operation import Operation
from sgqlc.endpoint.http import HTTPEndpoint
from urban_api_schema import urban_api_schema as schema
import geopandas as gpd
import re
import math
Provide the endpoint and token variables. Replace the token variable with your token to authenticate the account that hosts the created model. You can read more about how to retrieve a token in the Authentication guide. Set a path to the parcels dataset.
token="ACCESS_TOKEN"
endpoint_headers = {
'X-Esri-Authorization': 'Bearer ' + token,
}
endpoint_url = 'https://urban-api.arcgis.com/graphql'
endpoint = HTTPEndpoint(endpoint_url, endpoint_headers)
dataset_geojson = 'LOCAL_PATH/parcels.geojson'
Use geopandas to read the parcels into a parcels variable.
parcels = gpd.read_file(dataset_geojson)
To see the type and amount of data contained in the dataset, print the number of parcels, the names of attributes available in the dataset, and its spatial reference.
print("Number of parcels: {}, number of columns: {}".format(
parcels.shape[0], parcels.shape[1])
)
print("List of the attributes headers: {}".format(list(parcels.columns)))
wkid = int(re.search(r'\d+', str(parcels.crs)).group())
print("Spatial reference: {}".format(wkid))
The output should look something like this:
Number of parcels: 1387, number of columns: 9
List of the attributes headers: ['CustomID', 'Households', 'Jobs', 'Population', 'EdgeInfos', 'Suitabilit', 'Shape_Leng', 'Shape_Area', 'geometry']
Spatial reference: 3857
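The wkid value above is extracted from the CRS string that geopandas reports. A minimal standalone sketch of that extraction (the 'epsg:3857' string is a hypothetical example of what str(parcels.crs) may look like):

```python
import re

crs_string = 'epsg:3857'  # hypothetical value of str(parcels.crs)

# pull the first run of digits out of the CRS string
wkid = int(re.search(r'\d+', crs_string).group())
print(wkid)  # 3857
```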
As you can see, the file contains 1387 parcels. You will add the parcels in batches of 500 to avoid timeout errors. Note that this requires sending 3 mutations to write all the parcels to the urban model.
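The batch count follows directly from these numbers. A quick sketch (pure Python, no geopandas needed) also shows that slicing past the end of a sequence simply yields a shorter final batch:

```python
import math

num_parcels = 1387
batch_size = 500

# ceil(1387 / 500) = 3 mutations in total
print(math.ceil(num_parcels / batch_size))  # 3

# slicing beyond the end is safe, so the last batch is just smaller
items = list(range(num_parcels))
last_batch = items[2 * batch_size:3 * batch_size]
print(len(last_batch))  # 387
```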
Define two helper functions, extract_polygon_coords and get_batch, with the following roles:
- The extract_polygon_coords function extracts the exterior and interior rings of the input polygon. For any polygon, there is always one exterior ring (stored as a single linear ring object) and zero or more interior rings (stored as a list of linear rings). Note that the function only accepts the Polygon geometry type, not MultiPolygon.
- The get_batch function takes a collection of parcels of a specified size and saves it in a list that can later be used in the GraphQL mutation. Each individual parcel stores the geometry in the form of 2D rings, as well as the optional attributes: suitability score, custom ID, and existing values of population, jobs, and households.
def extract_polygon_coords(geometry):
if geometry.type == 'Polygon':
# slice the linear ring to extract the coordinates
exterior_coords = geometry.exterior.coords[:]
interior_coords = []
for interior in geometry.interiors:
interior_coords.append(interior.coords[:])
elif geometry.type == 'MultiPolygon':
raise ValueError('There are polygons of type MultiPolygon in your set. Split them to single polygons first.')
else:
raise ValueError('Unhandled geometry type: ' + repr(geometry.type))
# go through the interior rings if there are any in the polygon
if interior_coords:
coords = []
coords.append(exterior_coords)
for element in interior_coords:
coords.append(element)
return coords
else:
return [exterior_coords]
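The ring order matters here: extract_polygon_coords always returns the exterior ring first, followed by any interior rings (holes). A minimal pure-Python sketch of that assembly, using hypothetical coordinate lists in place of shapely ring objects:

```python
# hypothetical coordinate lists standing in for shapely ring objects
exterior_coords = [(0, 0), (10, 0), (10, 10), (0, 10), (0, 0)]
interior_coords = [[(4, 4), (6, 4), (6, 6), (4, 6), (4, 4)]]

# exterior ring first, then the holes, as extract_polygon_coords returns them
rings = [exterior_coords] + interior_coords
print(len(rings))  # 2
```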
def get_batch(parcels, start, batch_size):
    end = start + batch_size
    # get a batch of parcels; slicing past the end of the dataframe is safe
    parcels_selection = parcels[start:end]
# transform the geodataframe to a list of dictionaries
# each row (that is parcel) is shown as a separate dictionary
parcels_transformed = parcels_selection.to_dict('index')
parcels_list = []
for _, attributes_dict in parcels_transformed.items():
# initialize a single parcel object
single_parcel = {
'attributes': {},
'geometry':{'rings':[], 'spatial_reference': {'wkid':wkid}}
}
# set the attributes for the single parcel
attributes = {
'custom_id': attributes_dict['CustomID'],
'households': int(attributes_dict['Households']),
'jobs': int(attributes_dict['Jobs']),
'population': int(attributes_dict['Population']),
}
# extract a geometry of a single parcel (2d rings)
geometry = extract_polygon_coords(attributes_dict['geometry'])
# save single parcel's attributes and geometry to the single parcel object
single_parcel['attributes'] = attributes
single_parcel['geometry']['rings'] = geometry
# add a single parcel to the list
parcels_list.append(single_parcel)
return parcels_list
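For reference, a single entry in the parcels_list produced by get_batch has the following shape (all attribute and coordinate values here are hypothetical):

```python
# hypothetical parcel object as assembled by get_batch
single_parcel = {
    'attributes': {
        'custom_id': 'P-0001',
        'households': 12,
        'jobs': 3,
        'population': 30,
    },
    'geometry': {
        # one exterior ring, no holes, in the dataset's spatial reference
        'rings': [[(0.0, 0.0), (10.0, 0.0), (10.0, 10.0), (0.0, 10.0), (0.0, 0.0)]],
        'spatial_reference': {'wkid': 3857},
    },
}
print(single_parcel['geometry']['spatial_reference']['wkid'])  # 3857
```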
Make calls to the Urban API endpoint and add the parcels to the model. Remember to replace the urban_database_id value with the ID of the urban model you created. It might take a while until all the mutations are successfully sent to the endpoint.
start = 0
batch_size = 500
for loop in range(math.ceil(len(parcels)/batch_size)):
print("Loop number: ", loop, "/", math.ceil(len(parcels)/batch_size)-1)
parcels_list = get_batch(parcels, start, batch_size)
# initialize the mutation
op = Operation(schema.Mutation)
# add parcels to the urban model
create_parcels = op.create_parcels(urban_database_id= "885e8566549d44db93df3184521XXXXX",
parcels=parcels_list)
# select relevant return fields
create_parcels.attributes.__fields__('global_id')
# make a call to the endpoint
json_data = endpoint(op)
errors = json_data.get('errors')
if errors:
print(errors)
start+=batch_size
Go to ArcGIS Online and check if the parcels were added to your urban model.