Investigating The Distribution of Crime by Type

Geo-Vis Project Assignment, TMU Geography, SA8905, Fall 2025


Hello everyone, and welcome to my blog!

Today’s topic is the distribution of crime in Toronto. I aim to give the public and affected stakeholders a better understanding of how, where, and why different types of crime are distributed in relation to urban features like commercial buildings, public transit, restaurants, parks, open spaces, and more. We will also look at some of the socio-economic indicators of crime and, from there, identify ways to implement relevant, context-specific crime mitigation and reduction strategies.

This project investigates how crime data analysis can better inform urban planning and the distribution of social services in Toronto, Ontario. Research across diverse global contexts highlights that crime is shaped by a mix of socioeconomic, environmental, and spatial factors, and that evidence-based planning can reduce harm while improving community well-being. The following review synthesizes findings from six key studies, alongside observed crime patterns within Toronto.


Accompanying the literature review, I created a 3D model that displays a range of information, including maps made in ArcGIS Pro. The data was sourced from the Toronto Police Service Public Safety Data Portal and Toronto’s Neighbourhood Profiles from the 2021 Census. The objective is to draw insightful conclusions about which types of crime are clustering where in Toronto, which socio-economic and/or urban infrastructural indicators are contributing to these patterns, and which solutions could be implemented to reduce overall crime rates across all of Toronto’s neighbourhoods, keeping equity in mind.

The distribution of crime across Toronto’s neighbourhoods reflects a complex interplay of socioeconomic conditions, built environment characteristics, mobility patterns, and levels of community cohesion. Understanding these geographic and social patterns is essential to informing more effective city planning, targeted service delivery, and preventive interventions. Existing research emphasizes the need for long-term, multi-approach strategies that address both immediate safety concerns and the deeper structural inequities that shape crime outcomes. Mansourihanis et al. (2024) highlight that crime is closely linked to urban deprivation, noting that inequitable access to resources and persistent neighbourhood disadvantages influence where and how crime occurs. Their work stresses the importance of integrating crime prevention with broader social and economic development initiatives to create safer, and more resilient urban environments (Mansourihanis et al., 2024).

Mansourihanis, O., Mohammad Javad, M. T., Sheikhfarshi, S., Mohseni, F., & Seyedebrahimi, E. (2024). Addressing Urban Management Challenges for Sustainable Development: Analyzing the Impact of Neighborhood Deprivation on Crime Distribution in Chicago. Societies, 14(8), 139. https://doi.org/10.3390/soc14080139

Click here to view the literature review I conducted on this topic.


Methods – Creating a 3D Interactive Crime Investigation Board

The purpose of this 3D map is to provide an interactive tool that can be updated regularly over time, allowing users to build on the research using information in varying formats (e.g. literature, images, news reports, raw data, and various map types presenting comparable socio-economic data; thread can be used to connect images and other information to associated areas on the map). The model is designed so that media items can easily be added, removed, and connected using materials like tacks, clips, and cork board. Crime incidents can be tracked and recorded in real time, allowing quick identification of where crime is clustering based on geography, socio-economic context, and proximity to different land use types and urban features like transportation networks. We can continue to record and analyze which urban features or amenities could be deterring or attracting criminal activity. This supports fast, context-specific crime management solutions that will ultimately help reduce overall crime rates in the city.

1. Conduct a detailed literature review. 
Here is the literature review I conducted to address this topic.

2. Download the following data from Open Data | Toronto Police Service Public Safety Data Portal. Each dataset was filtered to show points from 2025 only.

- Dataset: Shooting and Firearm Discharges
- Dataset: Homicides
- Dataset: Assault
- Dataset: Auto Theft
- Dataset: Break and Enter

Toronto Neighbourhood Profiles, 2021 Census from: Neighbourhood Profiles - City of Toronto Open Data Portal
- Average Total Household Income by Neighbourhood
- Unemployment Rates by Neighbourhood

3. After examining the full datasets by year, select a time period to map. In this case, July 2025, the month with the greatest number of recorded crimes this year.

4. Map Setup
- Coordinate system: NAD 1983 UTM Zone 17N
- Rotation: -17
- Geography:
  - City of Toronto, ON, Canada
  - Neighbourhood boundaries from Toronto Open Data Portal

5. Add the crime incident data reports and Toronto’s Neighbourhood Boundary file.

Geospatial Analysis Tools Used
Tool - Select by Attributes: select and delete the records that are not being mapped. In this case, from the attribute table:
Select by Attribute: [OCC_YEAR] [is less than] [2025], then delete the selected records

Tool - Summarize within
Count the number of crime incidents within each neighbourhood boundary polygon for the 5 selected crime types, for preliminary analysis and mapping.
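Conceptually, Summarize Within produces a per-polygon count of the incident points that fall inside it. A minimal pure-Python sketch of that idea, using axis-aligned rectangles as stand-ins for neighbourhood boundaries (the real tool handles true polygon geometry, and all names and coordinates below are hypothetical):

```python
# Toy version of what Summarize Within computes: the number of incident
# points inside each polygon. Rectangles stand in for neighbourhood
# boundaries here; ArcGIS Pro works with real polygon geometry.

def summarize_within(polygons, points):
    """Return {polygon name: count of points inside its rectangle}."""
    counts = {name: 0 for name in polygons}
    for x, y in points:
        for name, (xmin, ymin, xmax, ymax) in polygons.items():
            if xmin <= x <= xmax and ymin <= y <= ymax:
                counts[name] += 1
    return counts

# Hypothetical neighbourhoods (xmin, ymin, xmax, ymax) and incident points
neighbourhoods = {
    "Downtown": (0, 0, 10, 10),
    "Suburb": (11, 0, 20, 10),
}
incidents = [(2, 3), (5, 5), (12, 4)]
print(summarize_within(neighbourhoods, incidents))  # counts per area
```

The resulting per-neighbourhood counts are what feed the preliminary maps.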

Design Tools and Map Types Used
- Dot Density
  - 2025 crime rates, by type, annual and for July 2025
- Heat Map
  - 2025 crime rates, by type, annual and for July 2025
- Choropleth
  - Average Total Household Income, City of Toronto by Neighbourhood
  - Unemployment Rates Across Toronto, 2021
- Design tools (e.g. convert to graphics)

Based on the literature review and analysis of the presented maps, this model allows us to further analyze, visually display, and record the data and findings. Users can see where points are clustering, then examine the urban features, land use, and socio-economic context of cluster areas in order to identify potential solutions, with equity in mind.

Supplies
- Thread
- Painted toothpicks
- Mini clothes pins
- Highlighters, markers, etc.
- Scissors
- Hot glue
- Images of indicators
- Relevant/insightful literature research
- Socio-economic maps: population income, unemployment, and density
- Crime maps: dot density maps of crime by type and heat maps of crime distribution by type, for the 5 selected crime types, covering all incidents that occurred during July 2025

Process
1. Attach cork board to poster board;

2. Cut out and place down main maps that have been printed (maps created in ArcGIS Pro, some additional design edits made in Canva);

3. Outline the large or central base map with tacks; use string to connect the tacks outlining the City of Toronto's regional boundary line.

4. Using colour-painted toothpicks (alternatively, tacks may be used depending on size limitations), crime incidents can be recorded in real time, with different colours representing different crime types.

5. Additional data can be added and joined to other map elements over time. This data could include: images and locations of crime indicators; new literature findings; news reports; raw data; different map types presenting comparable socio-economic data; community input via email, consultation meetings, 911 calls, or surveys; graphs; tables; land use types and features; and more.

6. Thread is used to connect images and other information to associated areas on the map. In this case, blue string and tacks were used to highlight preventative crime measures and red to represent an indicator of crime.

7. Sticky notes can be used to update the day and month (using a new poster/cork board for each year), under “Time Stamp”

8. Google Earth was used, with its satellite imagery, terrain layer, and urban features layer, to further analyze land use type and function, and significant features like Union Station, a major public transit connection point located within Toronto’s densest and overall largest crime hot spot.

9. A satellite imagery base map in ArcGIS was used to compare large green spaces (parks, ravines, golf courses etc.) with the distribution of each incidence point on the dot map created. Select each point field individually for optimal view and map analysis.

10. Video and photo content used to display the final results was created using an iPhone camera and the iMovie video editing app.

See photos and videos for reference!

Socioeconomic and Environmental Indicators of Crime

A consistent theme across the literature and my own findings is the strong connection between neighborhood deprivation and crime. Mansourihanis et al. (2024) emphasize that understanding the “relationship between urban deprivation and crime patterns” supports targeted, long-term strategies for urban safety. Concentrated poverty, population density, and low social cohesion are significant predictors of violence (Mejia & Romero, 2025; M. C. Kondo et al., 2018). Similarly, poverty and weak rule of law correlate more strongly with homicide rates than gun laws alone (Menezes & Kavita, 2025).

Environmental characteristics also influence crime distribution. Multiple studies link greater green space to reduced crime, higher social cohesion, and stronger perceptions of safety (Mejia & Romero, 2025). Exposure to green infrastructure can foster community pride and engagement, further reinforcing crime-preventive effects (Mejia & Romero, 2025). Relatedly, Stalker et al. (2020) show that community violence contributes to poor mental and physical health, with feelings of unsafety directly associated with decreased physical activity and weaker social connectedness.

Other urban form indicators—including land-use mix, connectivity, and residential density—shape mobility patterns that, in turn, affect where crime occurs. Liu, Zhao, and Wang (2025) find that property crimes concentrate in dense commercial districts and transit hubs, while violent crimes occur more often in crowded tourist areas. These patterns reflect the role of population mobility, economic activity, and social network complexity in structuring urban crime.

Crime Prevention and Community-Based Solutions

Several authors highlight the value of integrating built-environment design, green spaces, and community-driven interventions. Baran et al. (2014) show that larger parks, active recreation features, sidewalks, and intersection density all promote park use, while crime, poverty, and disorder decrease utilization. Parks and walkable environments also support psychological health and encourage social interactions that strengthen community safety. In addition, green micro-initiatives—such as community gardens or small landscaped interventions—have been found to enhance residents’ emotional connection to their neighborhoods while reducing local crime (Mejia & Romero, 2025).

At the policy level, optimizing the distribution of public facilities and tailoring safety interventions to local conditions are essential for sustainable crime prevention (Liu, Zhao, & Wang, 2025). For gun violence specifically, trauma-informed mental health care, early childhood interventions, and focused deterrence are recommended as multidimensional responses (Menezes & Kavita, 2025).

Spatial Crime Patterns in Toronto

When mapped across Toronto’s geography, the crime data revealed distinct clustering patterns that mirror many of the relationships described in the literature. Assault, shootings, and homicides form a broad U- or O-shaped distribution that aligns with neighborhoods exhibiting lower average incomes and higher unemployment rates. These patterns echo global findings on deprivation and violence.

Downtown Toronto—particularly the area surrounding Union Station—emerges as the city’s highest-density crime hotspot. This zone features extremely high connectivity, car-centric infrastructure, dense commercial and mixed land use, and limited green space. These conditions resemble those identified by Liu, Zhao, and Wang (2025), where transit hubs and high-traffic commercial districts generate elevated rates of property and violent crime. Google Earth imagery further highlights the concentration of major built-form features that attract large daily populations and mobility flows, reinforcing the clustering of assaults and break-and-enter incidents in the downtown core.

Auto theft is relatively evenly distributed across the city and shows weaker clustering around transit or commercial nodes. However, areas with lower incomes and higher unemployment still show modestly higher auto-theft levels. Break and enter incidents, by contrast, concentrate more strongly in high-income neighborhoods with lower unemployment—suggesting that offenders selectively target areas with greater material assets.

Across all crime categories, one consistent pattern is the notable absence of incidents within large green spaces such as High Park and Rouge National Urban Park. This supports the broader literature connecting green space with lower crime and improved perceptions of safety (Mejia & Romero, 2025; Baran et al., 2014). Furthermore, as described, different kinds of crime occur in low- versus high-income neighbourhoods, emphasizing the need for context-specific solutions that take into consideration both crime type and socio-economics.

Synthesis and Relevance for Toronto

Collectively, these findings indicate that crime in Toronto is shaped by intersecting socioeconomic factors, environmental features, and mobility patterns. Downtown crime clustering reflects high density, transit connectivity, and land-use complexity; outer-neighborhood violence aligns with deprivation; and green spaces consistently correspond with lower crime. These patterns mirror global research emphasizing the role of social cohesion, urban form, and economic inequality in shaping crime distribution.

Understanding these relationships is essential for planning decisions around green infrastructure investments, targeted social services, transit-area safety strategies, and neighborhood-specific interventions. Ultimately, integrating environmental design, socioeconomic supports, and community-based programs will support safer, healthier, and more equitable outcomes for Toronto residents.

Mapping and Printing Toronto’s Socioeconomic Status Index in 3D

Menusan Anantharajah, Geovis Project Assignment, TMU Geography, SA8905, Fall 2025

Hello, this is my blog post!

My Geovis project will explore the realms of 3D mapping and printing through a multi-stage process that utilizes various tools. I have always had a small interest in 3D modelling and printing, so I selected this medium for the project. Although this is my first attempt, I was quite pleased with the process and the results.

I decided to map out a simplified Socioeconomic Status (SES) Index of Toronto’s neighbourhoods in 2021 using the following three variables:

  • Median household income
  • Percentage of population with a university degree
  • Employment rate

It should be noted that since these variables exist on different scales, they were standardized using z-scores and then scaled to a 0-100 range. The neighbourhoods will be extruded by the SES index value, meaning that neighbourhoods scoring high will be taller in height. I chose SES as my variable of choice since it would be interesting to physically visualize the disparities and differences between the neighbourhoods by height.

Data Sources

Software

A variety of tools were used for this project, including:

  • Excel (calculating the SES index and formatting the table for spatial analysis)
  • ArcGIS Pro (spatially joining the neighbourhood shapefile with the SES table)
  • shp2stl* (takes the spatially joined shapefile and converts it to a 3D model)
  • Blender (used to add other elements such as title, north arrow, legend, etc.)
  • Microsoft 3D Builder** (cleaning and fixing the 3D model)
  • Ultimaker Cura (preparing the model for printing)

* shp2stl requires an older Node.js installation
** Microsoft 3D Builder is discontinued, though you can still sideload it

Process

Step 1: Calculate the SES index values from the Neighbourhood Profiles

The three SES variables (median household income, percentage of population with a university degree, employment rate) were extracted from the Neighbourhood Profiles table. Using Microsoft Excel, these variables were standardized using z-scores, then combined into a single average score, and finally rescaled to a 0-100 range. I then prepared the final table for use in ArcGIS Pro, which included the identifiers (neighbourhood names) with their corresponding SES values. After this was done, the table was exported as a .csv file and brought over to ArcGIS Pro.
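The Excel steps above can be sketched in code. This is an illustrative reconstruction of the same arithmetic (the project itself used Excel formulas), with hypothetical values for four neighbourhoods:

```python
# Sketch of the SES index calculation performed in Excel: z-score each
# variable, average the three z-scores per neighbourhood, then rescale
# the averages to 0-100. The input values below are hypothetical.
from statistics import mean, pstdev

def zscores(values):
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Hypothetical values for four neighbourhoods
income = [65000, 120000, 48000, 90000]   # median household income
degree_pct = [30.0, 55.0, 22.0, 41.0]    # % with a university degree
employment = [58.0, 66.0, 54.0, 61.0]    # employment rate

# Average the three z-scores for each neighbourhood
avg_z = [mean(t) for t in zip(zscores(income),
                              zscores(degree_pct),
                              zscores(employment))]

# Rescale so the lowest neighbourhood scores 0 and the highest 100
lo, hi = min(avg_z), max(avg_z)
ses = [100 * (z - lo) / (hi - lo) for z in avg_z]
print([round(s, 1) for s in ses])
```

The resulting SES column, paired with the neighbourhood names, is what gets exported as the .csv for ArcGIS Pro.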

Step 2: Create the Spatially Joined Shapefile using ArcGIS Pro

The neighbourhood boundary file and the newly created SES table were imported into ArcGIS Pro. Using the Add Join feature, the two data sets were combined into one unified shapefile, which was then exported as a .shp file.

The figure above shows what the SES map looks like in a two-dimensional view. The areas with lighter hues represent neighbourhoods with low SES values, while the ones in dark green represent neighbourhoods with high SES values.

Step 3: Convert the shapefile into a 3D model file using shp2stl

Before using shp2stl, make sure that you have an older version of Node.js (v11.15.0) and npm (6.7.0) installed. I would also recommend placing your shapefile in a new directory, as it can later be used as a Node project folder. Once the shapefile is in the new folder, open the folder in Windows Terminal (or Command Prompt) and run the following:

npm install shp2stl

This will bring in all the necessary modules into the project folder. After that, the script can be written. I created the following script:

const fs = require('fs');
const shp2stl = require('shp2stl');

// Convert the spatially joined shapefile to a binary STL,
// extruding each neighbourhood by its SES value (the SES_z field).
shp2stl.shp2stl('TO_SES.shp', {
  width: 150,           // width of the output model
  height: 25,           // maximum extrusion height
  extraBaseHeight: 3,   // solid base beneath the neighbourhoods
  extrudeBy: "SES_z",   // attribute field that drives the extrusion
  binary: true,         // write a binary (smaller) STL file
  verbose: true         // log progress to the console
}, function(err, stl) {
  if (err) throw err;   // abort on conversion errors
  fs.writeFileSync('TO_NH_SES.stl', stl);
});

This script was written in Visual Studio Code; however, you can use any editor (even Notepad works). It was saved as shapefile_convert.js in the project folder and then executed in the Terminal using this:

node shapefile_convert.js

The result is a 3D model that looks like this:

Since the model only contains Toronto’s neighbourhoods, we have to import it into Blender and create the other map elements.

Step 4: Add the Title, Legend, North Arrow and Scale Bar in Blender

The 3D model was brought into Blender, where the other map elements were created and added alongside the core model. To create the scale bar for the map, the 3D model was overlaid onto a 2D map that already contained a scale bar, as shown in the following image.

After creating the necessary elements, the model needs to be cleaned for printing.

Step 5: Cleaning the model using Microsoft 3D Builder

When importing the model into 3D Builder, you may encounter this:

Once you click to repair, the program should be able to fix various mesh errors like non-manifold edges, inverted faces or holes.

After running the repair tool, the model can be brought into Ultimaker Cura.

Step 6: Preparing the model for printing

The model was imported into Ultimaker Cura to determine the optimal printing settings. As I had to send this model to my local library to print, this step was crucial to see how the changes in the print settings (layer height, infill density, support structures) could impact the print time and quality. As the library had an 8-hour print limit, I had to ensure that the model was able to be printed out within that time limit.

With this tool, I was able to determine the best print settings (0.1 mm fine resolution, 10% infill density).

With everything finalized from my side, I sent the model over to be printed at the library; this was the result:

Overall, the print of the model was mostly successful. Most of the elements were printed out cleanly and as intended. However, the 3D text could not be printed with the same clarity, so I decided to print out the textual elements on paper and layer them on top of the 3D forms.

The following is the final resulting product:

Limitations

While I am still satisfied with the end result, there were some limitations to the model. The model still required further modifications and cleaning before printing; this was handled by the library staff at Burnhamthorpe and Central Library in Mississauga (huge shoutout to them). The text elements were also messy, which was expected given the size and width of the typeface used. One improvement to the model would be to print the elements separately and at a larger scale; this would ensure that each part is printed more clearly.

Closing Thoughts

This project was a great learning experience, especially for someone who had never tried 3D modelling and printing before. It was also interesting to see the 3D map highlighting the disparities between neighbourhoods; some neighbourhoods with high SES index values were literally towering over the disadvantaged bordering neighbourhoods. Although this project began as an experimental and exploratory endeavour, the process of 3D mapping revealed another dimension of data visualization.

References

City of Toronto. (2025). Neighbourhoods [Data set]. City of Toronto Open Data Portal. https://open.toronto.ca/dataset/neighbourhoods/ 

City of Toronto. (2023). Neighbourhood profiles [Data set]. City of Toronto Open Data Portal. https://open.toronto.ca/dataset/neighbourhood-profiles/

Mapping Toronto’s Post-War Urban Sprawl & Infill Growth (1945-2021)

A Geovisualization Project by Mandeep Rainal.

SA8905 – Master of Spatial Analysis, Toronto Metropolitan University.

For this project, I explore how Toronto has grown and intensified over time, by creating a 3D animated geovisualization using Kepler.gl. I will be using 3D building massing data from the City of Toronto and construction period information from the 2021 Census data (CHASS).

Instead of showing a static before-and-after map, I decided to build a 3D animated geovisualization that reveals how the city “fills in” over time, showing the early suburban expansion, mid-era infill, and rapid post-2000 intensification.

To do this, I combined the following:

  • Toronto’s 3D Massing Building Footprints
  • Age-Class construction era categories
  • A Custom “Built-Year” proxy
  • A timeline animation created in Kepler.gl and captured with Microsoft Windows screen recording.

The result is a dynamic sequence showing how Toronto physically grew upward and outward.

BACKGROUND

Toronto is Canada’s largest and fastest growing city. Understanding where and when the built environment expanded helps explain patterns of suburbanization, identify older and newer development areas and see infill and intensification. This also helps contextualize shifts in density and planning priorities for future development.

Although building-level construction years are not publicly available, the City of Toronto provides detailed 3D massing geometry, and Statistics Canada provides construction periods at the census tract level for private dwellings.

By combining these sources into a single animated geovisualization, we can visualize Toronto’s physical growth pattern over 75 years.

DATA

  • City of Toronto – 3D Building Massing (Polygon Data)
    1. Height attributes (average height)
    2. Building Footprints
    3. Used for 3D extrusions
  • City of Toronto – Municipal Boundary (Polygon Data)
    1. Used to clip the census metropolitan area down to the Toronto city core.
  • 2021 Census Tract Boundary
  • CHASS (2021 Census) – Construction Periods for Dwellings
    1. Total dwellings
    2. 1960 and before
    3. 1961-1980
    4. 1981-1990
    5. 1991-2010
    6. 2011-2015
    7. 2016-2021
    8. Used to assign Age classes and a generalized “BuiltYear” for each building.

METHODOLOGY

Step 1: Cleaning and Preparing the Data in ArcGIS Pro

  • I first imported the collected data into ArcGIS. I clipped the census tract layers to the City of Toronto boundary to get census tracts for Toronto only.
  • Next, I joined the census tract polygon layer we created to the construction period data that was imported. This gives us census tracts with construction period counts.
  • Because Toronto does not publish building-level construction years, I used the census construction-era categories as proxies for building age and created an age classification system based on proportions: summing the dwelling counts for each period, dividing by total dwellings to get proportions, and assigning each tract to one of three classes:
    • Mostly Pre-1981 dwellings
    • Mixed-era dwellings
    • Mostly 2000+ dwellings
  • Next, I needed a numeric date field for Kepler to activate the time slider, so I assigned a representative year to each tract using the age classes:
    • If age class = Mostly Pre-1981 dwellings, BuiltYear = 1945
    • If age class = Mixed-era dwellings, BuiltYear = 1990
    • If age class = Mostly 2000+ dwellings, BuiltYear = 2010
  • To make the built year Kepler-compatible, a new date field was created and formatted as 1945-01-01.
  • The data was then exported as GeoJSON files to import into Kepler.gl. The built-year data was also exported as a CSV, because Kepler doesn’t easily pick up the time field from GeoJSON.
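The classification and built-year assignment above can be sketched in code. This is an illustrative reconstruction rather than the project's exact field calculations: the 60% thresholds, field names, and counts below are all assumptions.

```python
# Illustrative sketch of the tract classification described above.
# Dwelling counts per construction period become proportions, each
# tract gets an age class, and each class maps to a representative
# "BuiltYear" date string that Kepler.gl's time slider can read.
# The 60% thresholds are assumptions, not the project's exact rules.

CLASS_TO_YEAR = {
    "Mostly Pre-1981 dwellings": 1945,
    "Mixed-era dwellings": 1990,
    "Mostly 2000+ dwellings": 2010,
}

def classify_tract(counts):
    """counts: dwelling totals keyed by census construction period."""
    total = sum(counts.values())
    pre_1981 = (counts["1960 and before"] + counts["1961-1980"]) / total
    post_1990 = (counts["1991-2010"] + counts["2011-2015"]
                 + counts["2016-2021"]) / total
    if pre_1981 > 0.6:
        age_class = "Mostly Pre-1981 dwellings"
    elif post_1990 > 0.6:
        age_class = "Mostly 2000+ dwellings"
    else:
        age_class = "Mixed-era dwellings"
    built_year = CLASS_TO_YEAR[age_class]
    return age_class, f"{built_year}-01-01"  # Kepler-compatible date

# A hypothetical older tract: 85% of dwellings built before 1981
tract = {"1960 and before": 700, "1961-1980": 150, "1981-1990": 50,
         "1991-2010": 60, "2011-2015": 20, "2016-2021": 20}
print(classify_tract(tract))
```

Each tract's date string is then ready for the Kepler-compatible date field described above.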

Stage 2: Visualizing the Growth in Kepler

  • Once the layers are loaded into Kepler, the tool allows you to manipulate and visualize different attributes quickly.
  • First the 3D Massing GeoJSON was set to show height extrusion based on the average height field. The colour of the layer was muted and set to be based on the age classes and dwelling eras of the buildings.
  • The second layer was a point layer, also based on the age classes. It fills in the 3D massings in brighter colours as the time slider progresses.
  • The Built Date CSV was added as a time-based filter for the build date field.

The final visualization was screen recorded and shows an animation of Toronto being built from 1945 to 2021.

  • Teal = Mixed-era dwellings
  • Amber = Mostly 2000+ dwellings
  • Dark purple = Mostly Pre-1981 dwellings

RESULTS

The animation reveals key patterns on development in the city.

  • Pre-1981 areas dominate older neighbourhoods; the purple-shaded areas show Old Toronto, Riverdale, High Park, and North York.
  • Mixed-era dwellings appear in more transitional suburbs, filling in teal and showing subdivisions with infill.
  • Mostly 2000+ dwellings fill in amber and highlight rapid intensification in areas like downtown with its high-rise boom, North York Centre, and Scarborough Town Centre.

The animation shows suburban sprawl expanding outward, before the vertical intensification era begins around the year 2000.

Because Kepler.gl cannot export 3D time-slider animations as standalone HTML files, I captured the final visualization using Microsoft Windows screen recording instead.

LIMITATIONS

This visualization used census tract–level construction-period data as a proxy for building age, which means the timing of development is generalized rather than precise. I had to collapse the CHASS construction ranges into age classes because the census periods span multiple decades and cannot be animated in Kepler.gl’s time slider, which only accepts a single built-year value per feature. Because all buildings within a tract inherit the same age class, fine-grained variation is lost and the results are affected by aggregation. Census construction categories are broad, and assigning a single representative “built year” further simplifies patterns. The Kepler animation therefore illustrates symbolic patterns of sprawl, infill, and intensification, not exact chronological construction patterns.

CONCLUSION

This project demonstrates how multiple datasets can be combined to produce a compelling 3D time-based visualization of a city’s growth. By integrating ArcGIS Pro preprocessing with Kepler’s dynamic visualization tools, I was able to:

  • Simplify census construction-era data
  • Generate meaningful urban age classes
  • Create temporal building representations
  • Visualize 75+ years of urban development in a single interactive tool

Kepler’s time slider adds an intuitive, animated story of how Toronto grew, revealing patterns of change that static maps cannot communicate.

Geospatial Visualization of Runner Health Data: Toronto Waterfront Marathon

Geovis Project Assignment, TMU Geography, SA8905, Fall 2025

Hello everyone! I am excited to share my running geovisualization blog with you all. This blog will allow you to transform the way you use GPS data from your phone or smart watch!

This idea came to me as I recorded my half marathon run on my Apple Watch in 2023 using the app Strava. Since then, I have developed an interest in health tracking data, and when assigned this project, I thought: maybe I can make this data my own.

As a result, I explored the options and was able to create a 3D representation of my run and how I was doing physically throughout.

Here is a Youtube link to the final product!

The steps are as follows if you want to give this type of geospatial analysis a try yourself!

Step 1.

You will need to install the app Strava. This health and fitness app will track GPS data from either your phone or watch, recording your speed, elevation, and heart rate (watch only). You will also need the app RunGap, which allows you to transfer your activity data and export it to a “.fit” file. A .fit file is a binary activity format that records heart rate, speed, and elevation, geolocated by x and y coordinates every second (one row per second).

Step 2.

Once you have the apps downloaded, start a health activity on the Strava app. From there you can transfer your Strava data to RunGap.

After you sign in and import the Strava data, go to the activity you want to export as a .fit file. Save the .fit file and transfer it to your computer.

Step 3.

Now that you have the .fit file, you will need a tool to convert it to a CSV. Go to https://developer.garmin.com/fit/overview/ and, in Step 1 of that page, download the FIT SDK from https://developer.garmin.com/fit/download/. The download will appear in your Downloads folder as FitSDKRelease_21.171.00.zip. Unzip this file and navigate to >java>FitToCSV.bat. This is the tool that you will use on the .fit file. To do this, go to your .fit file’s properties and change the “Open with:” application to your >java>FitToCSV.bat path.

Now simply run the .fit file: the tool will open and convert it to a CSV in the same folder after you press any key to continue.

Step 4.

Now, open your CSV. The data is initially messy, and the fields are mixed. To clean it, I added a new sheet and then deleted rows from the original, continuing to narrow it down using the filter function. In the end, you want only the “data” rows in the Type column, and only rows with lat and long coordinates, in order to create a point feature class. I also renamed the fields; for example, value 1 became Timestamp(s), which is used as the primary key to differentiate the rows. To convert the coordinates to degrees, I used this calculation:

  • Lat_Degrees: Lat_semicircles / 11930464.71
  • Long_Degrees: Long_semicircles / 11930464.71

Furthermore, to display the points as lines in the final map, four more fields need to be added to the Excel sheet: start lat, start long, end lat, and end long. These can be calculated simply by taking the coordinates of the next row as the end lat and end long of the current row. You will also need to do the same with altitude to make a 3D representation of the elevation.
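The conversion and the start/end pairing above can be sketched as follows (the project did this in Excel; the sample rows here are hypothetical). The divisor 11930464.71 is simply 2^31 / 180, since FIT stores coordinates as 32-bit "semicircles":

```python
# Sketch of the semicircle-to-degree conversion and the start/end field
# construction described above (performed in Excel in this project).
# The sample rows are hypothetical.

SEMICIRCLES_PER_DEGREE = 2**31 / 180  # = 11930464.711...

def to_degrees(semicircles):
    """Convert a FIT semicircle coordinate to decimal degrees."""
    return semicircles / SEMICIRCLES_PER_DEGREE

# Hypothetical track rows: (lat_semicircles, long_semicircles, altitude_m)
rows = [
    (520565000, -947890000, 91.2),
    (520566200, -947888500, 91.6),
    (520567900, -947886100, 92.3),
]

# Pair each row with the next one to build line segments carrying
# start/end coordinates and start/end altitude (one fewer segment
# than there are points).
segments = []
for (lat1, lon1, alt1), (lat2, lon2, alt2) in zip(rows, rows[1:]):
    segments.append({
        "start_lat": to_degrees(lat1), "start_long": to_degrees(lon1),
        "end_lat": to_degrees(lat2), "end_long": to_degrees(lon2),
        "start_alt": alt1, "end_alt": alt2,
    })
print(len(segments))
```

Each segment row is exactly what the XY To Line tool consumes in the next step.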

Step 5.

Now that your CSV is cleaned, it is ready to be exported as spatial data. Open ArcGIS Pro and create a new project, then load your CSV into a new map. This table will be used in the XY To Line geoprocessing tool, with the start and end coordinates projected to WGS_1984_UTM_Zone_17N for Toronto.

Once you run the tool, your data should look something like this, displaying lines connecting each initial point/row.

Step 6.

Now it is time to bring your route to life! Start by running the Feature To 3D By Attribute geoprocessing tool on your feature class using the height field as your elevation/altitude.

Your line should now appear in 3D when you open a 3D Map Scene and import the 3D layer.

Step 7.

To add more dimensions to the symbology colours, I used “Bivariate Colours”. This provides a unique combination of speed and heart rate at each leg of the race.

To make the elevation more visually appealing, I used the extrusion function on the line feature class. Then, I used the “Min Height” category with the expression “$feature.Altitude__m_ / 3”. To further add realism, I added the ground elevation surfaces layer called WorldElevation3D/Terrain3D, so that the surrounding topography pops out more.

Step 8.

Now that the layer and symbology are refined, the final part of the visualization is creating a bird's-eye view of the race trail from start to finish. To do this, I once again used ArcGIS Pro, adding an animation in the View tab. From there, I continuously added keyframes along the path until the end. Finally, I exported the video directly to my computer.

Step 9. Canva

To conclude, I used Canva to add the legend to the map, add music, and a nice-looking title.

And now, you have a 3D running animation…! I hope you have learned something from this blog and give it a try yourself. It was very satisfying taking a real-life achievement and converting it to an in-depth geospatial representation. :)

Parks and Their Association with Average Total Alcohol Expenditure (Alcohol in Parks Program in Toronto, ON)

Welcome to my Geovisualization Assignment!

Author: Gabyrel Calayan

Geovisualization Project Assignment

TMU Geography

SA8905 – FALL 2025

Today, we are going to be looking at Parks and Recreation Facilities and their possible association with average alcohol expenditure across census tracts (in light of the Alcohol in Parks Program in the City of Toronto), using data acquired from the City of Toronto and Environics Analytics (City of Toronto, n.d.).

Context

Using R Studio’s expansive tool set for map creation and Quarto documentation, we are going to be creating a thematic and an interactive map for parks and its association with Average Total Alcohol Expenditure in Toronto. The idea behind topic was really out of the blue. I was just thinking of a fun, simple topic that I wanted to do that I haven’t done yet for my other assignments! And so I landed on this because of data availability while learning some new skills at R Studio and try out the Quarto documentation process.

Data

  • Environics Analytics – Average Alcohol Expenditure (shapefile for census tracts, in CAD $) (Environics Analytics, 2025).
  • City of Toronto – Parks and Recreation Facilities (point data, filtered down to the 55 parks that participate in the program) (City of Toronto, 2011).

Methodology

  • Using R Studio to map out my Average Alcohol Expenditure and the 55 Parks that are a part of the Alcohol in Parks Program by the City of Toronto
  • Utilize tmap functions to create both a static thematic map and an interactive map
  • Utilize Quarto documentation to create a readme file of my assignment
  • Showcasing the mapping capabilities and potential of R Studio as a mapping tool

Example tmap code for viewing maps

This tmap code initializes what kind of view you want (there are only two kinds of views)

  • Static thematic map

## This is for viewing as a static map

## tmap_mode("plot")

## tm_shape(Alcohol_Expenditure) + tm_polygons()

  • Interactive map

## This is for viewing as an interactive map

## tmap_mode("view")

## tm_shape(Alcohol_Expenditure) + tm_polygons()

Visualization process

Step 1: Installing and loading the necessary packages so that R Studio can recognize our inputs

  • These inputs are kind of like puzzle pieces: you need the right piece (package) to put the entire puzzle together.
  • So we would need a bunch of packages to visualize our project:
    • sf
    • tmap
    • dplyr
  • These three packages are important because "sf" lets us read the shapefiles into R Studio, "tmap" lets us actually create the maps, and "dplyr" lets us filter our shapefiles and the data inside them.
  • Also, it's quite possible you already have these packages installed from earlier work. In that case, you can just use the library() function to load the ones you need. But I like installing them again in case I forgot.

## Code for installing the packages

## install.packages("sf")

## install.packages("tmap")

## install.packages("dplyr")

## Loading the packages

## library(sf)

## library(tmap)

## library(dplyr)

We can see in our console that it says "package 'sf' successfully unpacked and MD5 sums checked". That basically means it's done installing.

  • In addition, warning messages in the console output indicate that we already have these packages installed.

After installing and loading these packages, we can begin loading and filtering the dataset so that we can move on to visualizing the data itself. The results of installing these packages can be seen in the "Console" section at the bottom left-hand side of R Studio (it may depend on the user, but I have seen people move the "Console" section to the top right-hand side of the R Studio interface).

Step 2: Loading and filtering our data

  • We must first set the working directory of where our data is and where our outputs are going to go

## Setting work directory

## setwd()

  • This code sets the folder where R reads your files from and where your outputs will be saved on your computer
  • Now that we set our working directory, we can load in the data and filter it

## Code for naming our variables in R Studio and loading it in the software

## Alcohol_Parks <- read_sf("Parks and Recreation Facilities - 4326.shp")

## Alcohol_Expenditure <- read_sf("SimplyAnalytics_Shapefiles_5efb411128da3727b8755e5533129cb52f4a027fc441d8b031fbfc517c24b975.shp")

  • As we can see in the code snippets above, we are using one of the functions that belongs to the sf package: read_sf() loads our data and recognizes it as a shapefile.
  • It will appear on the right in the "Environment" section, which means all the columns in the dataset have been read

Now we can see our data in the Environments Section. And there’s quite a lot. But no worries we only need to filter the Parks data!

Step 3: Filtering the data

  • Since we only need the park data for Toronto, we grab just the records for the 55 parks in the Alcohol in Parks Program
  • This follows a two-step approach:
    • Name your variable to match its filtered state
    • Then the actual filtering comes into play

## Code for running the filtering process

## Alcohol_Parks_Filtered <- filter(Alcohol_Parks, ASSET_NAME == "ASHTONBEE RESERVOIR PARK" | ASSET_NAME == "BERT ROBINSON PARK"| ASSET_NAME == "BOND PARK" | ASSET_NAME == "BOTANY HILL PARK" | ASSET_NAME == "BYNG PARK"

  • As we can see in the code above, before the filtering process we name the new variable to match its filtered state: "Alcohol_Parks_Filtered"
    • In addition, the park names we type in the code must exactly match those found in the parks dataset!
    • For example, the filtering wouldn't work with "Bond Park". It must be all caps: "BOND PARK"
  • Then we use the filter() function to filter the shapefile by ASSET_NAME and pick out the 55 parks (R's %in% operator would be a more concise alternative to the long chain of == comparisons)
  • We can see that our filtered dataset now contains the 55 parks with all the original columns attached, the most important being the geometry column, so we can conduct visualizations!
  • Once we completed that, we can test out the tmap function to see how the data looks before we map it out.

Step 4: Run some test visualizations to see if there are any issues

  • Now, we can actually use some tmap functions to check that our data works
  • tm_shape() identifies which shapefile we are using to visualize the variable
  • tm_polygons() and tm_dots() visualize the variables as either polygons or dots
  • For tm_polygons(), fill and style set which column you are visualizing and which data classification method you would like to use

## Code for testing our visualizations

## tm_shape(Alcohol_Expenditure) + tm_polygons(fill = "VALUE0", style = "jenks")

## tm_shape(Alcohol_Parks_Filtered) + tm_dots()

Now, we can see that it actually works! We can see that the map above is our shapefile and the one on the bottom is our parks!

Step 5: Using tmap and its extensive functions to build our map

  • We can now fully visualize our map and add all the cartographic elements necessary to flesh it out and make it as professional as possible

## Building our thematic map

## tmap_mode("plot")

## tm_shape(Alcohol_Expenditure) +

tm_polygons(fill = "VALUE0", fill.legend = tm_legend ("Average Alcohol Expenditure ($ CAD)"), fill.scale = tm_scale_intervals(style = "jenks", values = "Greens")) +

tm_shape(Alcohol_Parks_Filtered) + tm_bubbles(fill = "TYPE", fill.legend = tm_legend("The 55 Parks in Alcohol in Parks Program"), size = 0.5, fill.scale = tm_scale_categorical(values = "black")) +

tm_borders(lwd = 1.25, lty = "solid") +

tm_layout(frame = TRUE, frame.lwd = 2, text.fontfamily = "serif", text.fontface = "bold", color_saturation = 0.5, component.autoscale = FALSE) +

tm_title(text = "Greenspaces and its association with Alcohol Expenditure in Toronto, CA", fontfamily = "serif", fontface = "bold", size = 1.5) +

tm_legend(text.size = 1.5, title.size = 1.2, frame = TRUE, frame.lwd = 1) +

tm_compass(position = c ("top", "left"), size = 4) +

tm_scalebar(text.size = 1, frame = TRUE, frame.lwd = 1) +

tm_credits("Source: Environics Analytics\nProjection: NAD83", frame = TRUE, frame.lwd = 1, size = 0.75)

  • Quite a lot of code!
  • Now this is where the puzzle piece analogy comes into play as well
    • First, we call tmap_mode("plot") to specify that we want a static map first
    • We add both our variables together because we want to see our point data and how it lies on top of our alcohol expenditure shapefile
    • Utilizing tm_polygons, tm_shape, and tm_bubbles to draw both our variables as polygons and as point data
      • tm_bubbles is dots and tm_polygons draws the polygons of our alcohol expenditure shapefile
    • The code that is in our brackets for those functions are additional details that we would like to have in our map
    • For example: fill.legend = tm_legend ("Average Alcohol Expenditure ($ CAD)")
      • This code snippet makes our legend title "Average Alcohol Expenditure ($ CAD)" for our polygon shapefile
      • The same applies for our point data for our parks
    • Basically, we can divide our code into two sections:
      • The tm_polygons all the way to tm_bubbles is essentially drawing our shapefiles
      • The tm_borders all the way to the tm_credits are what goes on outside our shapefiles
        • For example:
    • tm_layout() and the code inside it hold details that can be modified across the whole map. component.autoscale = FALSE turns off the automatic rescaling of our map components so that I can adjust the title and other elements to my liking

Now we have made our static thematic map! On to the next part which is the interactive visualization!

Since we built our puzzle parts for the thematic map, we just need to switch it over to the interactive map using tmap_mode("view")

This code chunk describes the process to create the interactive map

library(tmap)
library(sf)
library(dplyr)


##Loading in the data to check if it works
Alcohol_Parks <- read_sf("Parks and Recreation Facilities - 4326.shp")
Alcohol_Expenditure <- read_sf("SimplyAnalytics_Shapefiles_5efb411128da3727b8755e5533129cb52f4a027fc441d8b031fbfc517c24b975.shp")

#Filtering test_sf_point to show only parks where you can drink alcohol
Alcohol_Parks_Filtered <- 
  filter(Alcohol_Parks, ASSET_NAME == "ASHTONBEE RESERVOIR PARK" | ASSET_NAME == "BERT ROBINSON PARK"
                                 | ASSET_NAME == "BOND PARK" | ASSET_NAME == "BOTANY HILL PARK" | ASSET_NAME == "BYNG PARK"
                                 | ASSET_NAME == "CAMPBELL AVENUE PLAYGROUND AND PARK" | ASSET_NAME == "CEDARVALE PARK" 
                                 | ASSET_NAME == "CHRISTIE PITS PARK" | ASSET_NAME == "CLOVERDALE PARK" | ASSET_NAME == "CONFEDERATION PARK"
                                 | ASSET_NAME == "CORKTOWN COMMON" | ASSET_NAME == "DIEPPE PARK" | ASSET_NAME == "DOVERCOURT PARK"
                                 | ASSET_NAME == "DUFFERIN GROVE PARK" | ASSET_NAME == "EARLSCOURT PARK" | ASSET_NAME == "EAST LYNN PARK"
                                 | ASSET_NAME == "EAST TORONTO ATHLETIC FIELD" | ASSET_NAME == "EDWARDS GARDENS" | ASSET_NAME == "EGLINTON PARK"
                                 | ASSET_NAME == "ETOBICOKE VALLEY PARK" | ASSET_NAME == "FAIRFIELD PARK" | ASSET_NAME == "GRAND AVENUE PARK"
                                 | ASSET_NAME == "GORD AND IRENE RISK PARK" | ASSET_NAME == "GREENWOOD PARK" | ASSET_NAME == "G. ROSS LORD PARK"
                                 | ASSET_NAME == "HILLCREST PARK" | ASSET_NAME == "HOME SMITH PARK" | ASSET_NAME == "HUMBERLINE PARK" | ASSET_NAME == "JUNE ROWLANDS PARK"
                                 | ASSET_NAME == "LA ROSE PARK" | ASSET_NAME == "LEE LIFESON ART PARK" | ASSET_NAME == "MCCLEARY PARK" | ASSET_NAME == "MCCORMICK PARK" 
                                 | ASSET_NAME == "MILLIKEN PARK" | ASSET_NAME == "MONARCH PARK" | ASSET_NAME == "MORNINGSIDE PARK" | ASSET_NAME == "NEILSON PARK - SCARBOROUGH"
                                 | ASSET_NAME == "NORTH BENDALE PARK" | ASSET_NAME == "NORTH KEELESDALE PARK" | ASSET_NAME == "ORIOLE PARK - TORONTO" | ASSET_NAME == "QUEEN'S PARK"
                                 | ASSET_NAME == "RIVERDALE PARK EAST" | ASSET_NAME == "RIVERDALE PARK WEST" | ASSET_NAME == "ROUNDHOUSE PARK" | ASSET_NAME == "SCARBOROUGH VILLAGE PARK"
                                 | ASSET_NAME == "SCARDEN PARK" | ASSET_NAME == "SIR WINSTON CHURCHILL PARK" | ASSET_NAME == "SKYMARK PARK" | ASSET_NAME == "SORAREN AVENUE PARK"
                                 | ASSET_NAME == "STAN WADLOW PARK" | ASSET_NAME == "THOMSON MEMORIAL PARK" | ASSET_NAME == "TRINITY BELLWOODS PARK" | ASSET_NAME == "UNDERPASS PARK"
                                 | ASSET_NAME == "WALLACE EMERSON PARK" |  ASSET_NAME == "WITHROW PARK")  


##Now as a interactive map
tmap_mode("view")

tm_shape(Alcohol_Expenditure) + 
  
  tm_polygons(fill = "VALUE0", fill.legend = tm_legend ("Average Alcohol Expenditure ($ CAD)"), fill.scale = tm_scale_intervals(style = "jenks", values = "Greens")) +
  
  tm_shape(Alcohol_Parks_Filtered) + tm_bubbles(fill = "TYPE", fill.legend = tm_legend("The 55 Parks in Alcohol in Parks Program"), size = 0.5, fill.scale = tm_scale_categorical(values = "black")) + 
  
  tm_borders(lwd = 1.25, lty = "solid") + 
  
  tm_layout(frame = TRUE, frame.lwd = 2, text.fontfamily = "serif", text.fontface = "bold", color_saturation = 0.5, component.autoscale = FALSE) +
 
   tm_title(text = "Greenspaces and its association with Alcohol Expenditure in Toronto, CA", fontfamily = "serif", fontface = "bold", size = 1.5) +
  tm_legend(text.size = 1.5, title.size = 1.2, frame = TRUE, frame.lwd = 1) +
  
  tm_compass(position = c("top", "right"), size = 2.5) + 
  
  tm_scalebar(text.size = 1, frame = TRUE, frame.lwd = 1, position = c("bottom", "left")) +
  
  tm_credits("Source: Environics Analytics\nProjection: NAD83", frame = TRUE, frame.lwd = 1, size = 0.75)

Link to viewing the interactive map: https://rpubs.com/Gab_Cal/Geovis_Project

  • The only difference in this code chunk is that tmap_mode() is set not to "plot" but to "view"
    • For example: tmap_mode("view")

The map is now complete!

Results (Based on our interactive map)

  • Just based on the default settings for the interactive map, tmap includes a wide range of elements that make the map dynamic!
    • We have the zoom in and layer selection/basemap selection function on the top left
    • The compass that we created is shown in the top right
    • And the legend that we made is locked in at the bottom right
    • Our scalebar is also dynamic which changes scales when we zoom in and out
    • And our credits and projection section is also seen in the bottom right of our interactive map
    • We can also click on our layers to see the columns attached to the shapefiles
  • For example, we can click on the point data to see the id, LocationID, AssetID, Asset_Name, Type, Amenities, Address, Phone, and URL. While for our polygon shapefile we can see the spatial_id, name of the CT, and the alcohol spending value in that CT
  • As we can see in our interactive map, the areas with the highest Average Alcohol Expenditure lie near the upper part of the downtown core of Toronto
    • For example, the neighbourhoods in dark green include Bridle Path-Sunnybrook-York Mills, Forest Hill North and South, and Rosedale, to name a few
  • However, only a few parks that are a part of the program reside in these regions of high spending on alcohol
  • Most parks reside in census tracts where the alcohol expenditure falls in the $500 to $3,000 range
  • While there doesn't seem to be much of an association, there are certainly more factors at play in where people buy their alcohol and where they decide to consume it
  • Based on just visual findings:
    • For example: It’s possible that people simply do not drink in these parks even though its allowed. They probably find the comfort of their home a better place to consume alcohol
    • Or people don’t want to drink at a park when they could be doing more active group – like activities


Paint by Raster: Watercolour Cartography Illustrating Landform Expansion at Leslie Street Spit, Toronto (1972 – 2025)

Emma Hauser

Geovis Project Assignment, TMU Geography, SA8905, Fall 2025

Hi everyone, welcome to my final Geovisualization Project tutorial. With this project, I wanted to combine my love of watercolour painting with cartography. I used Catalyst Professional, ArcGIS Pro, and watercolours to transform Landsat imagery spanning the years 1972 to 2025 into blocks of colour representing periods of landform expansion at Leslie Street Spit. I also made an animated GIF to help illustrate the process.

Study Area

Just to give you a bit of background to the site, Leslie Spit is a manmade peninsula on the Toronto waterfront, made up of brick and concrete rubble from construction sites in Toronto starting in 1959. It was intended to be a port-related facility, but by the early 1970s, this use case was no longer relevant, and natural succession of vegetation had begun. The landform continued to expand through lakefilling, as did the vegetation and wildlife, and by 1995 the Toronto and Region Conservation Authority started enhancing natural habitats, founding Tommy Thompson Park.

Post Classification Change Detection

The Landsat program has been providing remotely sensed imagery since 1972, by which time the Baselands and "Spine Road" had been constructed. Pairs of Landsat images can be compared by classifying the pixels as land or water in Catalyst Professional using an unsupervised classification algorithm, then performing "delta" (post-classification) change detection in ArcGIS Pro with the Raster Calculator to determine areas that have undergone landform expansion in that time period. The tool literally subtracts the pixel values denoting land or water in a raster at an earlier date from those in a raster at a later date in order to compare them and detect change. If we perform this process seven times, up until 2025, we get a near-complete picture of the land base formation of the Spit and can visualize these changes.

Let’s begin!

Step 1: Data Collection from USGS EarthExplorer

The first step is to collect 9 images forming 7 image pairs from USGS EarthExplorer. I searched for images that had minimal cloud cover covering the extent of Toronto.

For the year 1985, we need to double up on images in order to transition from the Multispectral Scanner sensor with 60m resolution to the Thematic Mapper sensor with 30m resolution. 1980 MSS and 1985 MSS will form a pair, and 1985 TM and 1990 TM will form a pair.

Step 2: Data Processing in Catalyst Professional

Now we can begin processing our images. All images must be data-merged, either manually (using the Translate and Transfer Layers utilities) or via the metadata MTL.txt files (using the Data Merge tool), to join the image bands together, and then subset (using the Clipping/Subsetting tool) to the same extent. The geocoded extent is:

Upper left: 630217.500 P / 4836247.500 L
Lower right: 637717.500 P / 4828747.500 L

Using the 2025 image as an example, my window looked like this:

I started a new session of Unsupervised Classification and added two 8 bit channels.

I specified the K-Means algorithm with 20 maximum classes and 20 maximum iterations.

I used Post-Classification Analysis (Aggregation) to assign each of the 20 classes to an information class. These classes are Water and Land. I made sure all classes were assigned and I applied the result to the Output Channel.

I got this result:

I repeated this process for all images. For example, 1972 looked like this:

I saved all of the aggregation results as .pix files using the Clipping/Subsetting tool.

Step 3: Data Processing, Visualization, and GIF-making in ArcGIS Pro

We are ready to move onto our processing and visualization in ArcGIS Pro. Here, we will be performing the post classification or “delta” change detection.

I added the aggregation result .pix files to ArcGIS Pro. I exported the rasters to GRID format. The rasters now had values of 0 (No Data), 21 (Water), and 22 (Land). I used the Raster Calculator (Spatial Analyst) to subtract each earlier dated image from the next image in the sequence. So, 1974 minus 1972, 1976 minus 1974, and so on.

I got this result (with masking polygon included, explanation to follow):

The green (0) represents no change, the red (1) represents change from Water to Land (22 – 21), and the grey (-1) represents change from Land to Water (21 – 22).
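The Raster Calculator step boils down to a per-pixel subtraction. A small numpy sketch (toy 2×2 rasters, not the actual Landsat data) reproduces the -1/0/1 coding described above:

```python
import numpy as np

# Toy classified rasters using the same GRID codes as above:
# 21 = Water, 22 = Land.
earlier = np.array([[21, 21],
                    [22, 22]])
later   = np.array([[21, 22],
                    [22, 21]])

# "Delta" post-classification change detection: later minus earlier.
#  1 -> Water became Land (22 - 21): landform expansion
#  0 -> no change
# -1 -> Land became Water (21 - 22)
delta = later - earlier

expansion_pixels = int((delta == 1).sum())
print(expansion_pixels)  # 1 pixel of landform expansion
```

Counting the pixels coded 1 (and multiplying by the pixel area) is also a quick way to estimate how much land was added in each period.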

I drew a polygon (shown in white) around the Spit so we can perform Extract by Mask (Spatial Analyst). This will clip the raster to a more specific extent.

I symbolized the extracted raster’s values of 0 and -1 with no colour and value 1 as red. We now have the first land area change raster for 1972 to 1974.

I repeated this for all time periods, symbolizing the portions of the raster with value 1 as orange, yellow, green, blue, indigo, and purple.

We can now begin our animation. I assigned each change raster its appropriate time period in the Layer Properties. A time slider appeared at the top of my map.

I added a keyframe for each time period to my animation by sliding to the correct time and pressing the green “+” button on the timeline. I used Fixed transitions of 1.5 seconds for each Key Length and extra time (3.0 seconds) at beginning and end to showcase the base raster and the finished product.

I added overlays (a legend and title) to my map. I ensured the Start Key was 1 (first) and the End Key was 9 (last) so that the overlays were visible throughout the entire 13.5 second animation.

I exported the animation as a GIF – voila!

Step 4: Watercolour Map Painting

To begin my watercolour painting, I used these materials:

  • Pencil and eraser
  • Drafting scale (or ruler)
  • Watercolour paper (Fabriano, cold press, 25% cotton, 12” x 15.75”)
  • Watercolour brushes (Cotman and Deserres)
  • Watercolour palettes (plastic and ceramic)
  • Watercolour drawing pad for test colour swatches
  • Water container
  • Lightbox (Artograph LightTracer)
  • Leslie Spit colour-printed reference image
  • Black India ink artist pen (Faber-Castell, not pictured)
  • Masking tape (not pictured)
  • Lots of natural light
  • JAZZ FM 91.1 playing on radio (optional)

I first sketched out in pencil some necessary map elements on the watercolour paper: title, subtitle, neatline, legend, etc. I then taped the reference image down onto the lightbox, and then taped the watercolour paper overtop.

I mixed colour and water until I achieved the desired hues and saturations.

From red to purple, I painted colours one by one, using the reference illuminated through the lightbox. When the last colour (purple) was complete, I added the Baselands and Spine Road in grey as well as all colours for the legend.

To achieve the final product, I added light grey paint for the surrounding land and used a black artist pen to go over my pencil lines and add a scale bar and north arrow.

The painting is complete – I hope you enjoyed this tutorial!

Evolution of Residential Real Estate in Toronto – 2014 to 2022

Shashank Prabhu, Geovis Project Assignment, TMU Geography, SA8905, Fall 2024 

Introduction
Toronto’s residential real estate market has experienced one of the most rapid price increases among major global cities. This surge has led to a significant affordability crisis, impacting the quality of life for residents. My goal with this project was to explore the key factors behind this rapid increase, while also analyzing the monetary and fiscal policies implemented to address housing affordability.

The Approach: Mapping Median House Prices
To ensure a more accurate depiction of the market, I used the median house price rather than the average. The median better accounts for outliers and provides a clearer view of housing trends. This analysis focused on all home types (detached, semi-detached, townhouses, and condos) between 2014 and 2022.
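Why the median rather than the average? A tiny Python illustration (with made-up prices, not TRREB figures) shows how a single outlier sale skews the mean while leaving the median untouched:

```python
import statistics

# Hypothetical sale prices (CAD) for one region in one month; a single
# ultra-luxury sale drags the average far upward.
prices = [700_000, 750_000, 800_000, 850_000, 12_000_000]

mean_price = statistics.mean(prices)
median_price = statistics.median(prices)

print(median_price)       # 800000 -- unaffected by the outlier
print(round(mean_price))  # 3020000 -- pulled upward by the outlier
```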

Although data for all years were analyzed, only pivotal years (2014, 2017, 2020, and 2022) were mapped to emphasize the factors driving significant changes during the period.

Data Source
The Toronto Regional Real Estate Board (TRREB) was the primary data source, offering comprehensive market watch reports. These reports provided median price data for Central Toronto, East Toronto, and West Toronto—TRREB’s three primary regions. These regions are distinct from the municipal wards used by the city.

Creating the Maps

Step 1: Data Preparation
The Year-to-Date (YTD) December figures were used to capture an accurate snapshot of annual performance. The median price data for each of the years across the different regions was organized in an Excel sheet, joined with TRREB’s boundary file (obtained through consultation with the Library’s GIS department), and imported into ArcGIS Pro. WGS 1984 Web Mercator projection was used for the maps.

Step 2: Visualization with 3D Extrusions
3D extrusions were used to represent price increases, with the height of each bar corresponding to the median price. A green gradient was selected for visual clarity, symbolizing growth and price.

Step 3: Overcoming Challenges

After creating the 3D extrusion maps for the respective years (2014, 2017, 2020, 2022), the next step was to export those maps to ArcGIS Online and then to Story Maps. The easiest way of doing so was to export each map as a Web Scene, which then appears under the Content section on ArcGIS Online.

  • Flattened 3D Shapes: Exporting directly as a Web Scene to add onto Story Maps caused extrusions to lose their 3D properties. This was resolved using the “Layer 3D to Feature Class” tool.

  • Lost Legends: However, after using the aforementioned tool, the Legends were erased during export. To address this, static images of the legends were added below each map in Story Maps.

Step 4: Finalizing the Story Map
After resolving these issues, the maps were successfully exported using the Export Web Scene option. They were then embedded into Story Maps alongside text to provide context and analysis for each year.

Key Insights
The project explored housing market dynamics primarily through an economic lens.

  • Interest Rates: The Bank of Canada’s overnight lending rate played a pivotal role, with historic lows (0.25%) during the COVID-19 pandemic fueling a housing boom, and sharp increases (up to 5% by 2023) leading to market cooling.
  • Immigration: Record-breaking immigration inflows also contributed to increased demand, exacerbating the affordability crisis.

While earlier periods like 2008 were critical in shaping the market, boundary changes in TRREB’s data made them difficult to include.

Conclusion
Analyzing real estate trends over nearly a decade and visualizing them through 3D extrusions offers a profound insight into the rapid rise of residential real estate prices in Toronto. This approach underscores the magnitude of the housing surge and highlights how policy measures, while impactful, have not fully addressed the affordability crisis.

The persistent rise in prices, even amidst various interventions, emphasizes the critical need for increased housing supply. Initiatives aimed at boosting the number of housing units in the city remain essential to alleviate the pressures of affordability and meet the demands of a growing population.

Link to Story Map (You will need to sign in through your TMU account to view it): https://arcg.is/WCSXG

3D String Mapping and Textured Animation: An Exploration of Subway Networks in Toronto and Athens

BY: SARAH DELIMA

SA8905 – Geovis Project, MSA Fall 2024

INTRODUCTION:

Greetings everyone! For my geo-visualization project, I wanted to combine my creative skills of Do It Yourself (DIY) crafting with the technological applications utilized today. This project was an opportunity to be creative using resources I had from home as well as utilizing the awesome applications and features of Microsoft Excel, ArcGIS Online, ArcGIS Pro, and Clipchamp.

In this blog, I’ll be sharing my process for creating a 3D physical string map model. To mirror my physical model, I’ll be creating a textured animated series of maps. My models display the subway networks of two cities. The first being the City of Toronto, followed by the metropolitan area of Athens, Greece.

Follow along this tutorial to learn how I completed this project!

PROJECT BACKGROUND:

For some background, I am more familiar with Toronto’s subway network. Fortunately enough, I was able to visit Athens and explore the city by relying on their subway network. As of now, both of these cities have three subway lines, and are both undergoing construction of additional lines. My physical model displays the present subway networks to date for both cities, as the anticipated subway lines won’t be opening until 2030. Despite the hands-on creativity of the physical model, it cannot be modified or updated as easily as a virtual map. This is where I was inspired to add to my concept through a video animated map, as it visualizes the anticipated changes to both subway networks!

PHYSICAL MODEL:

Materials Used:

  • Paper (used for map tracing)
  • Pine wood slab
  • Hellman ½ inch nails
  • Small hammer
  • Assorted colour cotton string
  • Tweezers
  • Krazy glue

Methods and Process:

For the physical model, I wanted to rely on materials I had at home. I also required a blank piece of paper for tracing the boundary and subway network of each city. This was done by acquiring open data and inputting it into ArcGIS Pro; the precise datasets used are discussed further under the virtual model. Once the tracings were created, I taped them to a wooden base. Fortunately, I had a perfect base of pine wood. I opted for Hellman ½ inch nails, as the wood was not too thick and these nails wouldn't split it. Using a hammer, each nail was carefully placed along the tracing outline of the cities and subway networks.

I did have to purchase thread so that I could display each subway line in its corresponding colour. Placing the thread around the nails required some patience. I cut the thread into smaller pieces to avoid knots, then used tweezers to hold the thread while wrapping it around the nails. When a new thread was added, I knotted it tightly around a nail and applied Krazy Glue to ensure it was tightly secured. The same method was applied when securing the end of a string.

Images of threading process:

City of Toronto Map Boundary with Tracing

After threading the city boundary and subway network, the paper tracing was removed, and I could begin filling in the space within the boundary. I opted to use black thread for the boundary and fill to contrast with both the base and the colours of the subway lines. The City of Toronto thread map was completed first, and the same steps were followed for the Athens thread map. Each city sits on an opposite side of the wood base, which was convenient and avoided the need for an additional base.

Of course, every map needs a title, legend, north arrow, projection, and scale. Once both 3D string maps were complete, the required titles and text were printed, laminated, and added to the wood base for both maps. I once again used nails, a hammer, and thread to create both legends. Below is an image of the final physical products of my maps!

FINAL PHYSICAL MODELS:

City of Toronto Subway Network Model:

Athens Metropolitan Area Metro Network Model:

VIRTUAL MODEL:

To create the virtual model, I used ArcGIS Pro to create my two maps and applied picture fill symbology to give them a thread-like texture. I’ll begin by discussing the open data acquired for the City of Toronto, followed by the metropolitan area of Athens.

The City of Toronto:

Data Acquisition:

For Toronto, I relied on the City of Toronto open data portal to retrieve the Toronto Municipal Boundary as well as the TTC Subway Network dataset. The most recent dataset still includes Line 3, which was kept for the purpose of the time series map. As for the anticipated Eglinton and Ontario lines, I could not find open data for these networks; however, Metrolinx has created interactive maps displaying the Ontario Line and Eglinton Crosstown (Line 5) stations and names. To note, the Eglinton Crosstown is identified as a light rail transit line but is considered part of the TTC subway network.

To compile the coordinates of each station on both routes, I used Microsoft Excel to create two sheets, one for the Eglinton line and one for the Ontario line. To determine the location of each subway station, I used Google Maps to drop a pin in the correct location by referencing the map visual published by Metrolinx.

Ontario Line Excel Table :

In ArcGIS Pro, I used the XY Table To Point tool to bring in the coordinates from each Excel sheet and establish points on the map. After successfully completing this, I had to connect the points into a continuous line. For this, I used the Points To Line tool, also in ArcGIS Pro.

XY Table To Point and Points To Line tools, used to add the coordinates to the map as points and connect them into a continuous line representing the subway route:
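Outside of ArcGIS Pro, the same two steps — turning coordinate rows into points, then points into a route — can be sketched in plain Python. The station names and coordinates below are illustrative placeholders, not the actual Metrolinx data:

```python
# Each row mirrors one line of the Excel sheet: station name, latitude, longitude.
# Names and coordinates are made up for illustration only.
stations = [
    ("Station A", 43.645, -79.380),
    ("Station B", 43.651, -79.375),
    ("Station C", 43.658, -79.368),
]

def xy_table_to_points(rows):
    """Analogue of the XY Table To Point step: rows -> (x, y) point tuples."""
    return [(lon, lat) for _, lat, lon in rows]

def points_to_line(points):
    """Analogue of the Points To Line step: consecutive points -> line segments."""
    return list(zip(points, points[1:]))

points = xy_table_to_points(stations)
route = points_to_line(points)
print(len(route))  # a 3-station route yields 2 connected segments
```

The real tools also handle spatial references and attribute fields; this sketch only captures the ordering logic that makes the points join into one continuous route.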

After achieving this, I had to clip the subway routes to fall within the boundaries of the City of Toronto and the Athens Metropolitan Area. I used the Pairwise Clip tool in the Geoprocessing pane to achieve this.

Geoprocessing Pairwise Clip tool parameters used. Note: the input features were the subway lines, with the city boundary as the clip features.
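Conceptually, clipping keeps only the geometry that falls inside the boundary. A heavily simplified stand-in, using a rectangular boundary instead of the real city polygon (the bounds and vertices are made up):

```python
# Simplified clip: keep only route vertices inside a rectangular boundary.
# The real Pairwise Clip tool works on true polygons and splits segments at
# the boundary edge; this sketch only filters whole vertices.
boundary = {"xmin": -79.6, "xmax": -79.1, "ymin": 43.6, "ymax": 43.9}

def clip_vertices(vertices, b):
    return [
        (x, y) for x, y in vertices
        if b["xmin"] <= x <= b["xmax"] and b["ymin"] <= y <= b["ymax"]
    ]

route = [(-79.5, 43.7), (-79.3, 43.75), (-78.9, 43.8)]  # last vertex lies outside
inside = clip_vertices(route, boundary)
print(inside)  # the out-of-bounds vertex is dropped
```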

Athens Metropolitan Area:

Data Acquisition:

To retrieve data for Athens, I accessed open data from Athens GeoNode. I imported the following layers to ArcGIS Online: Athens Metropolitan Area, Athens Subway Network, and the proposed Athens Line 4 Network. I did have to make minor adjustments to the data, as the Athens Metropolitan Area layer also displays the neighbourhood boundaries, and for the purpose of this project only the outer boundary was necessary. To overcome this, I used the Merge modify feature to merge all the individual polygons within the metropolitan area boundary into one. I also had to use the Pairwise Clip tool once again, as the Line 4 network exceeds the metropolitan boundary and thus falls beyond the area of study for this project.

Adding Texture Symbology:

ArcGIS has a variety of tools and features that can enhance a map’s creativity and visualization. For this project, I was inspired by an Esri yarn map tutorial. Given that the physical model used thread, I wanted to create a textured map with thread as well. To achieve this, I utilized the public folder provided with the tutorial, which includes portable network graphics (.png) cutouts of several fabrics as well as pen and pencil textures. To best mirror my physical model, I utilized a thread .png.

Esri yarn map tutorial public folder:

I added the thread .png images by replacing the solid fill of the boundaries and subway networks with a picture fill. This symbology works best with a .png image for lines, as it blends seamlessly with the base and surrounding features of the map. The thread .png uploaded as white, which allowed me to modify its colour to match the boundary or particular subway line without distorting the texture it provides.

For both the Toronto and Athens maps, the picture fill for each subway line and boundary was set to the thread .png in its corresponding colour. The boundaries of both maps were set to black, as in the physical model, and the subway lines also mirror the physical model, which follows the existing and future colours used for the subway routes. Below is the picture symbology with the thread .png selected and a tint applied for the subway lines.

City of Toronto subway Networks with picture fill of thread symbology applied:
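The reason a white texture tints so cleanly is that tinting effectively multiplies each pixel by the chosen colour, so the light-and-dark pattern of the thread survives. A rough per-pixel sketch of that idea (the exact blend ArcGIS Pro applies may differ, and the yellow used here is only an approximation of a subway line colour):

```python
def tint_pixel(pixel, tint):
    """Scale a grayscale texture pixel by a tint colour (all channels 0-255).
    White pixels (255) take on the full tint; darker pixels keep the shading."""
    r, g, b = pixel
    tr, tg, tb = tint
    return (r * tr // 255, g * tg // 255, b * tb // 255)

# A pure white thread pixel takes on the full tint colour...
print(tint_pixel((255, 255, 255), (255, 204, 41)))
# ...while a shadowed pixel stays proportionally darker, preserving texture.
print(tint_pixel((128, 128, 128), (255, 204, 41)))
```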

The basemap was also altered, since the physical model sits on a wood base. To mirror that, I extracted a Global Background layer from ArcGIS Online and modified it using the picture fill to upload a high resolution image of pine wood as the basemap for this model. For the city boundaries of both maps, the thread .png imagery was also applied with a black tint.

PUTTING IT ALL TOGETHER:

After creating both maps for Toronto and Athens, it was time to put them into an animation! The goal of the animation was to display each route and its opening year(s), visually conveying the evolution of the subway system, as my physical model merely captures the current networks.

I did have to play around with the layers to individually capture each subway line. The current subway network data for both Toronto and Athens contain all three routes in one layer, so I had to isolate each route for the time lapse, adding it in accordance with its initial opening date and year of most recent expansion. To achieve this, I set a Definition Query for each current subway route I was mapping while creating the animation.

Definition query tool accessed under layer properties:
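A definition query is essentially an attribute filter on a layer (e.g. showing only features where the route name matches). The same idea in plain Python, with made-up attribute records standing in for the layer's actual schema (the opening years shown are the real ones for Toronto's lines):

```python
# Each record mimics one feature in the combined subway layer.
# The field names are illustrative, not the dataset's actual schema.
features = [
    {"route": "Line 1", "opened": 1954},
    {"route": "Line 2", "opened": 1966},
    {"route": "Line 4", "opened": 2002},
]

def definition_query(layer, route_name):
    """Analogue of a definition query: draw only features matching the clause."""
    return [f for f in layer if f["route"] == route_name]

visible = definition_query(features, "Line 1")
print(visible)  # only the Line 1 feature is drawn for that keyframe
```

In ArcGIS Pro the query is written as a SQL clause under Layer Properties, but the effect on the animation keyframes is exactly this kind of filtering.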

Once I added each keyframe in order of the evolution of each subway route, I created a map layout for each map to add the required text and titles, as I did with the physical model. The layouts were then exported to Microsoft Clipchamp to create the video animation. I imported each map layout in .png format, and from there added transitions between my maps as well as sound effects!

CITY OF TORONTO SUBWAY NETWORK TIMELINE:

Geovis Project, TMU Geography, SA8905 Sarah Delima

(@s1delima.bsky.social) 2024-11-19T15:05:37.007Z

ATHENS METROPOLITAN AREA METRO TIMELINE:

Geovis Project, TMU Geography, SA8905 Sarah Delima

(@s1delima.bsky.social) 2024-11-19T15:12:18.523Z

LIMITATIONS: 

While this project allowed me to be creative with both my physical and virtual models, it did present certain limitations. A notable limitation of the physical model is that it is merely a static visual representation of the subway networks: unlike the virtual maps, it cannot easily be modified or updated as the networks change.

As for the virtual map, although open data was accessible for some of the subway routes, I did have to manually enter XY coordinates for the future subway networks, referencing reputable maps of the anticipated routes to ensure accuracy. Furthermore, given my limited timeline, I was unable to map the proposed extensions of current subway routes; rather, I focused on routes currently under construction with an anticipated completion date.

CONCLUSION: 

Although I grew up applying my creativity through creating homemade crafts, technology and applications such as ArcGIS allow for creativity to be expressed on a virtual level. Overall, the concept behind this project is an ode to the evolution of mapping, from physical carvings to the virtual cartographic and geo-visualization applications utilized today.

Visualizing Population on a 3D-Printed Terrain of Ontario

Xingyu Zeng

Geovisual Project Assignment @RyersonGeo, SA8905, Fall 2022

Introduction

3D visualization is an essential and popular category of geovisualization. After a period of development, 3D printing technology has become readily available in daily life. As a result, a 3D-printable geovisualization project is relatively easy to implement at the individual level. Moreover, compared to electronic 3D models, physical 3D-printed models have obvious advantages when explaining results to non-professional users.

Data and Software

3D model in Materialise Magics
  • Data Source: Open Topography – Global Multi-Resolution Topography (GMRT) Data Synthesis
  • DEM Data to a 3D Surface: AccuTrans 3D – which provides translation of 3D geometry between the formats used by many 3D modeling programs.
  • Converting a 3D Surface to a Solid: Materialise Magics – converts the surface to a solid with thickness; the model is then cut according to the boundaries of the 5 Transitional Regions of Ontario, with different thicknesses representing the differences in total population between regions (e.g. the Central region has a population of 5 million and a thickness of 10 mm; the West region has a population of 4 million and a thickness of 8 mm).
  • Slicing & Printing: an indispensable step for 3D printing. Because of the wide variety of printer brands on the market, most with their own manufacturer-developed slicing software, the specific process varies. One thing is common, though: after this step, the file is transferred to the 3D printer, and what follows is a long wait.

Visualization

The 5 Transitional Regions are reorganized from the 14 Local Health Integration Networks (LHINs), and the corresponding population and model heights (thicknesses) for the five regions of Ontario are:

  • West, clustering of: Erie-St. Clair, South West, Hamilton Niagara Haldimand Brant, Waterloo Wellington, has a total population of about 4 million, the thickness is 8mm.
  • Central, clustering of: Mississauga Halton, Central West, Central, North Simcoe Muskoka, has a total population of about 5 million, the thickness is 10mm.
  • Toronto, clustering of: Toronto Central, has a total population of about 1.4 million, the thickness is 2.8mm.
  • East, clustering of: Central East, South East, Champlain, has a total population of about 3.7 million, the thickness is 7.4mm.
  • North, clustering of: North West, North East, has a total population of about 1.6 million, the thickness is 3.2mm.
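The mapping from population to thickness above is linear at 2 mm of printed material per million residents, which a quick script confirms for all five regions:

```python
# Thickness rule used for the model: 2 mm per million residents.
# Populations (in millions) are taken from the region list above.
regions = {
    "West": 4.0,
    "Central": 5.0,
    "Toronto": 1.4,
    "East": 3.7,
    "North": 1.6,
}

def thickness_mm(population_millions, mm_per_million=2.0):
    """Convert a regional population into a model thickness in millimetres."""
    return round(population_millions * mm_per_million, 1)

for name, pop in regions.items():
    print(f"{name}: {thickness_mm(pop)} mm")
```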
Different thicknesses
Dimension Comparison
West region
Central region
Toronto
East region
North region

Limitations

The most unavoidable limitation of 3D printing is the accuracy of the printer itself. This concerns not only the mechanical performance of the printer, but also the materials used, the operating environment (temperature, UV intensity), and other external factors. As a result, the printed models do not match the digital designs exactly, even though they are accurate on the computer. On the other hand, a 3D-printed terrain can only represent variables expressible as a single value per region, such as the total population chosen here.

Toronto’s Rapid Transit System Throughout the Years, 1954 to 2030: Creating an Animated Map on ArcGIS Pro

Johnson Lumague

Geovis Project Assignment @RyersonGeo, SA8905, Fall 2022

Background

Toronto’s rapid transit system has been constantly growing throughout the decades. This transit system is managed by the Toronto Transit Commission (TTC) which has been operating since the 1920s. Since then, the TTC has reached several milestones in rapid transit development such as the creation of Toronto’s heavy rail subway system. Today, the TTC continues to grow through several new transit projects such as the planned extension of one of their existing subway lines as well as by partnering with Metrolinx for the implementation of two new light rail systems. With this addition, Toronto’s rapid transit system will have a wider network that spans all across the city.

Timeline of the development of Toronto’s rapid transit system

Based on this, a geovisualization product will be created which animates the history of Toronto’s rapid transit system and its development throughout the years. This post will provide a step-by-step tutorial on how the product was created, as well as show the final result at the end.
