Developing a 3-Dimensional Model of the Everest Region in Nepal Using Blender

By Matthew Abraham for SA8905 Geovis course project (Dr. Rinner)

This geovisualization project developed a working 3D model of the Sagarmatha region using the graphics and animation software Blender, culminating in a fly-through of the region highlighting all mountains above 8,000 m. The purpose of this blog is to detail the steps needed to develop a 3D model of any desired region around the world using Blender; it is therefore not limited to this one example.

Blender is a free, open-source graphics and animation program that has many uses beyond what was explored in this project. Many of its open film projects can be seen on its website at https://www.blender.org/features/projects/. Since this program has incredible diversity in its applications and can create photorealistic imagery, I chose it to produce a 3D mapped mountain environment of the Everest region in Nepal, combining both graphic design and geospatial data. It should be noted that this entire project was done with Cycles, Blender's physically based render engine.

This process from geography to 3D model included four critical steps and involved two core programs. The steps were:

  1. Collect geospatial data and identify the size of the region analyzed;
  2. Process this geospatial data within a geographic information system (GIS) – Quantum GIS;
  3. Convert the map to a 3D model using Blender – an open-source graphics animation program; and
  4. Process and develop a fly-through of the desired mountains.

The first step involved extracting digital elevation model (DEM) data from the US Geological Survey's Earth Explorer website (http://earthexplorer.usgs.gov/). A simple search for Mount Everest on Earth Explorer pulled up the region. Once the region was located, multiple points were used to define the areas of interest for data acquisition, and ASTER DEM data were then pulled for all four lat/long regions identified by Earth Explorer.


After downloading the DEM data, it was uploaded into QGIS to merge and crop the four DEM layers into the zone of interest. Raster –> Miscellaneous –> Merge was used to combine the four underlying DEM layers into one crop-able sheet. Next, the Raster Clipper tool was used to select the desired region from the merged DEM layer. This clipped section was saved as a TIFF file to be imported into Blender.
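The merge-and-clip step is, at its core, just mosaicking and slicing elevation grids. A toy pure-Python sketch of what QGIS does here (the tile sizes and constant elevations are illustrative, not real ASTER values):

```python
def mosaic(nw, ne, sw, se):
    """Merge four equally sized DEM tiles arranged in a 2x2 block
    (QGIS: Raster -> Miscellaneous -> Merge)."""
    top = [a + b for a, b in zip(nw, ne)]
    bottom = [a + b for a, b in zip(sw, se)]
    return top + bottom

def clip(grid, r0, r1, c0, c1):
    """Crop the merged grid to the rows/columns of interest
    (QGIS: Raster Clipper)."""
    return [row[c0:c1] for row in grid[r0:r1]]

def const_tile(elev, n=4):
    """An n x n tile of constant elevation, in metres."""
    return [[elev] * n for _ in range(n)]

merged = mosaic(const_tile(4000.0), const_tile(5000.0),
                const_tile(3000.0), const_tile(6000.0))
clipped = clip(merged, 2, 6, 2, 6)   # a centre region spanning all four tiles
print(len(merged), len(merged[0]), len(clipped), len(clipped[0]))  # 8 8 4 4
```

Real ASTER GDEM tiles are far larger (3601 × 3601 cells); the clipped array is what gets written out as the TIFF for Blender.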


Once the desired DEM region was converted into a TIFF file, the work could begin in Blender. Upon opening an empty project, the user is given an empty canvas with a cube in the centre of the 3D viewport.


The first step involved deleting the cube by pressing X and then clicking “delete”. Next, it was necessary to bring in a blank plane to display the geospatial data. This was done by pressing the shortcut Shift-A and selecting Mesh –> Plane. This produced a blank plane in the centre of the grid, where the cube had been.


The next step was to subdivide the plane into a grid of smaller faces. This was done in Edit Mode by pressing Tab and then selecting Subdivide. Subdivide breaks the plane down into as many smaller planes as desired; however, the more subdivisions there are, the more detail your computer has to process. Given the purpose of this assignment and the limitations of my computer, 500 subdivisions were made, breaking the plane into over 250,000 squares and 370,000 vertices.
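The square and vertex counts follow directly from the number of cuts: assuming Subdivide with N cuts turns the plane into (N+1)² faces with (N+2)² corner vertices, a quick back-of-the-envelope check looks like this (the exact totals Blender reports can differ depending on settings and modifiers):

```python
def plane_counts(cuts):
    """Faces and vertices of a single quad plane after `cuts` subdivision
    cuts: (cuts + 1) squares per side, (cuts + 2) vertices per side."""
    faces = (cuts + 1) ** 2
    vertices = (cuts + 2) ** 2
    return faces, vertices

print(plane_counts(500))  # (251001, 252004) -- over 250,000 squares
```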

Once the plane was subdivided, it was scaled up to a more appropriate size. To make scaling easier, units were assigned by going to the Scene tab and changing the units to “Metric”. In the Transform sidebar, the plane was then scaled up to 500 by 500 metres. Although this is not the actual scale of the DEM region, it provided enough size and detail to map the desired region appropriately.

After the plane was set up and given an appropriate scale, it was necessary to import the geospatial data onto it. This was done by going to the Modifier tab and selecting a “Displace” modifier from the pull-down menu. Click “New Texture”, then under Type select “Image or Movie”, and under Image select the TIFF file saved earlier.


The plane at this point will not show the terrain features, mainly due to the strength of the displacement. This can be adjusted by going back to the Modifier tab and changing the Strength of the Displace modifier until the desired look is achieved. It was also necessary to adjust the plane's Z-axis location to half of the displaced value in order to account for the displacement effects.
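What the Displace modifier computes per vertex is simple: an offset along the normal of strength × (texture value − midlevel), with midlevel defaulting to 0.5. A minimal stand-alone sketch (the strength of 100 is an illustrative value, not the one used in the project):

```python
def displace(z, texel, strength=100.0, midlevel=0.5):
    """Offset a vertex height the way Blender's Displace modifier does:
    displacement = strength * (texture_value - midlevel)."""
    return z + strength * (texel - midlevel)

# A flat plane (z = 0) displaced by heightmap values 0..1 read from the TIFF:
print([displace(0.0, t) for t in (0.0, 0.5, 1.0)])  # [-50.0, 0.0, 50.0]
```

With a midlevel of 0, the whole plane only rises, by up to the full strength, which is why lowering the plane's Z location by half the displacement recentres it.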


The next step was adding texture to this terrain to give it realistic definition and colouring. There are multiple methods for texturing in Blender; the one explored in this project was a colour ramp based on vertical height. This was all done in the Node Editor and involved multiple nodes. The first node was a Texture Coordinate node, which tells Blender what object the colours will apply to; “Generated” was selected so the colours would be generated automatically on the desired object. Next, a Separate XYZ node was used to isolate the Z-axis, creating a vertical layering of the selected colours. A Mapping node was then added to further control the locations of the colours on the object, specifying the Z-axis under Texture. Finally, a Gradient Texture was used alongside a ColorRamp node to develop the desired colour ranges for the 3D plane. Colours were chosen based on personal examination of the mapped region, going from dark green and brown for low-lying forests to white for snow-capped peaks. This all feeds into a Diffuse BSDF node, which creates the material for the plane.
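The node chain amounts to mapping normalized height to a colour. A stand-alone sketch of such a ramp (the stop positions and RGB values are illustrative choices, not the exact node settings used):

```python
def lerp(a, b, t):
    """Linear interpolation between two RGB triples."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

# Illustrative ramp stops: (normalized height, RGB), low forest to snow cap.
STOPS = [(0.0, (0.13, 0.35, 0.13)),   # dark green
         (0.4, (0.45, 0.33, 0.18)),   # brown
         (0.7, (0.55, 0.55, 0.55)),   # grey rock
         (1.0, (1.00, 1.00, 1.00))]   # white snow

def colour_ramp(h):
    """RGB colour for a normalized height h in [0, 1]."""
    for (h0, c0), (h1, c1) in zip(STOPS, STOPS[1:]):
        if h <= h1:
            return lerp(c0, c1, (h - h0) / (h1 - h0))
    return STOPS[-1][1]

print(colour_ramp(0.0))  # (0.13, 0.35, 0.13) -- valley floor
```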


The second-last thing to add prior to creating the fly-through was the sky texture, done by going to the World tab and, under Surface, selecting “Background” –> “Sky Texture”. The intensity and location of the Sun can be altered using the provided Vector and Colour mapping options. For this project, Ambient Occlusion was turned on, as it added more realism to the lighting of the 3D plane.

Lastly, text was added to the map to identify the four mountains above 8,000 metres. This was done by pressing Shift-A and selecting “Text”. The scale of the text was adjusted using the same method as for the plane. Next, the text was given a third dimension through the Depth option, and the appropriate names were written in Edit Mode. Each label was then rotated and moved into place on the 3D model.


Once the map was ready, I added camera animations to fly through the mountains. This was done by first creating a path for the camera to rest on. Once again, this was accessed by pressing Shift-A, going to Curve, and selecting “Path”. The path can then be scaled and shaped by moving the XYZ vector points of the line.


Once this path was adjusted to the appropriate location, I added a Follow Path object constraint to the camera under the Constraints tab. After adding the constraint, change Forward to “–Z” and Up to “Y”. In addition to fixing the camera's orientation, I keyframed its position on the path by pressing “I” over the Offset value while it read “0.000”. Next, I moved the current frame to the desired end frame (in this case 1600), changed the Offset value to “1.000”, and once again pressed “I”. This places the camera at the end of the path on the end frame, 1600.
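Those two keyframes make the camera's path Offset a simple ramp over the animation. A sketch of the mapping (this assumes linear interpolation between the keyframes; Blender's default F-curve easing is actually Bezier):

```python
def offset_at(frame, start=1, end=1600):
    """Follow Path offset between an Offset=0 keyframe at `start`
    and an Offset=1 keyframe at `end`, linearly interpolated."""
    frame = max(start, min(end, frame))
    return (frame - start) / (end - start)

print(offset_at(1), round(offset_at(800), 3), offset_at(1600))  # 0.0 0.5 1.0
```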

Next, it was necessary to add an Empty object for the camera to track throughout the animation. Once again, Shift-A was used, selecting “Empty” –> “Plain Axes”. Another object constraint was added to the camera, this time a Track To constraint, with the same Forward “–Z” and Up “Y”. The camera should now sit on the created path, pointed at the plain axes, which can be positioned at any desired region on the map.


For this project I put the plain axes near the peak of the first mountain and moved it throughout the animation. For each movement, the plain axes object was selected at the desired frame and a keyframe was inserted by pressing “I” and selecting “Location”. The object was then moved to the second mountain, the current frame was advanced by approximately 200 frames, and another Location keyframe was inserted. This told the camera target to move to the new location, giving the camera 200 frames to do so. The process was repeated twice more to capture the remaining mountains.

At this point it was possible to render the camera fly-through. A few settings needed to be defined in the Render tab: the desired resolution (1080p in this example), the Start and End Frames (1 and 1600 respectively), the frame rate (24 frames per second), and an output type. The project was first rendered as individual pictures (.png files) because, if the render process crashed, rendering could be resumed from the frame at which it crashed. In addition, under Sampling, the number of samples was increased to help reduce picture noise caused by light scattering. “Render Animation” was then clicked after all the settings were finalized. The rendering time varies with the number of samples, the detail of the images, and the number of frames; for this project, the render took around 4 hours.

After the rendering process was complete, it produced 1600 individual pictures, which were loaded into Blender's Video Sequence Editor by locating the folder the render output was saved in and selecting all the image frames. Once loaded, the output type was changed from .png images to h.264, a video output, and “Lossless Output” was selected under Encoding. Lossless Output ensures that there is no compression between the original frames and the new video output. This produced the video file of the entire project.

This example demonstrated how to create a 3D model of the Sagarmatha region in Nepal and a detailed fly-through of the region using the graphics and animation program Blender. The same process can be applied anywhere in the world with available DEM data.


Thanks for Reading!

 

3D Hexbin Map Displaying Places of Worship in Toronto

Produced by: Anne Christian
Geovis Course Assignment, SA8905, Fall 2015 (Rinner)

Toronto is often seen as the city of many cultures, and with different cultures often come different beliefs. I wanted to explore the places of worship in Toronto and determine which areas have the highest and lowest concentrations. As I explored ways to display this information that are effective and also unique, I discovered hexbin maps and 3D maps. While doing some exploratory analysis, I found that although hexbin maps have been created before and 3D maps have been printed before, I was unable to find anyone who had printed a 3D hexbin prism map, so I decided to take on this endeavor.

Hexbin maps are a great alternative technique for working with large data sets, especially point data. Hexagonal binning uses a hexagonal grid to divide map space into equal units and displays the information (in this case the places of worship) that falls within each unit. The tools used in this project include QGIS, ArcGIS, and ArcScene, although it could probably be completed entirely within QGIS and other open-source software.

Below are the specific steps I followed to create the 3D hexbin map:

  1. Obtained the places of worship point data (2006) from the City of Toronto’s Open Data Catalogue.
  2. Opened QGIS, and added the MMQGIS plugin.
  3. Inputted the places of worship point data into QGIS.
  4. Used the “Create Grid Lines Layer” tool (Figure 1) and selected the hexagon shape, which created a new shapefile layer of a hexagon grid.

    Figure 1: Create Grid Lines Layer Tool
  5. Used the “Points in Polygon” tool (Figure 2) which counts the points (in this case the places of worship) that fall within each hexagon grid. I chose the hexagon grid as the input polygon layer and the places of worship as the input point layer. The number of places of worship within each hexagon grid was counted and added as a field in the new shapefile.

    Figure 2: Points in Polygon Tool
  6. Inputted the created shapefile with the count field into ArcGIS.
  7. Obtained the census tract shapefile from the Statistics Canada website (https://www12.statcan.gc.ca/census-recensement/2011/geo/bound-limit/bound-limit-2011-eng.cfm) and clipped out the city of Toronto.
  8. Used the clip tool to include only the hexagons that are within the Toronto boundary.
  9. Classified the data into 5 classes using the quantile classification method, and attributed one value for each class so that there are only 5 heights in the final model. For example, the first class had values 0-3 in it, and the value I attributed to this class was 1.5. I did this for all of the classes.
  10. The hexagons for the legend were created using the editor toolbar, whereby each of the 5 hexagons were digitized and given a height value that matched with the map prism height.
  11. Inputted the shapefile with the new classified field values into ArcScene, and extruded the classified values and divided the value by 280 because this height works well and can be printed in a timely manner.
  12. Both the legend and hexagonal map shapefiles were converted into WRL format in ArcScene. The WRL file was opened in Windows 10's 3D Builder and converted into STL format.
  13. This file was then brought to the Digital Media Experience (DME) lab at Ryerson, where the Printrbot Simple was used to print the model via the Cura program. The model was rescaled where appropriate. My map took approximately 3 hours to print, but the time can vary depending on the spatial detail of what is being printed; the legend took approximately 45 minutes. Below is a short video of how the Printrbot created my legend. A similar process was used to create the map.
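Under the hood, steps 4 and 5 amount to assigning each point to its containing hexagon and counting. The same counts can be computed directly by snapping points to hexagon cells; a sketch using axial hex coordinates (the hexagon size and the point coordinates are illustrative):

```python
import math
from collections import Counter

def cube_round(q, r):
    """Round fractional axial coordinates to the nearest hexagon."""
    s = -q - r
    rq, rr, rs = round(q), round(r), round(s)
    dq, dr, ds = abs(rq - q), abs(rr - r), abs(rs - s)
    if dq > dr and dq > ds:
        rq = -rr - rs
    elif dr > ds:
        rr = -rq - rs
    return rq, rr

def point_to_hex(x, y, size):
    """Axial coordinates of the pointy-top hexagon containing (x, y)."""
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    return cube_round(q, r)

# Count illustrative "places of worship" points per hexagon cell.
points = [(0.1, 0.1), (0.2, -0.1), (5.0, 5.0)]
counts = Counter(point_to_hex(x, y, size=1.0) for x, y in points)
print(counts)
```

The count per hexagon is exactly the field that the Points in Polygon tool appends to the shapefile.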

The final map and legend (displayed in the image below) provide a helpful and creative way to display data. The taller prisms indicate areas with the most places of worship, and the shorter prisms indicate the areas in Toronto with the least places of worship. This hexagonal prism map allows for effective numerical comparisons between different parts of Toronto.


Use of a Laser Cutter to Create a 3D Bathymetric Chart

Mallory Carpenter,  SA8905 Geovisualization Assignment, Fall 2015

Bathymetric (depth) data collected for oceans and other water bodies are typically displayed in one of two ways: as a bathymetric chart, or as a depth raster. New technologies such as 3D printers and laser cutters allow depth data to be communicated better. Laser cutters in particular allow for “etching,” which can simultaneously communicate topographic data, letting the viewer better situate themselves in the landscape. Examples of this can be seen here and here.

A fjord is a coastal feature formed by glaciers. Typically, fjords contain steep vertical sidewalls and deep basins separated by shallow sills (ridges of bedrock that rise to depths of less than 50 m). Mapping Nachvak Fjord, located in the Torngat Mountains of Labrador, in 3D helps to better illustrate these unique bathymetric features.

The basic process is this:

  • Collection and processing of bathymetric data into useable raster format.
  • Importation of the raster data into GIS software.
  • The creation and export of contour data as vector files to secondary graphics.
  • The division of contours into separate layers, and the addition of any graphics for “etching.”
  • The use of different colours in the vector file to differentiate between etching and cutting.

The bathymetric data were collected between 2003 and 2009 by the Canadian Hydrographic Service and ArcticNet, and are available for free download from the Ocean Mapping Group website. The spatial resolution of the data is 5×5 m, with a vertical accuracy of 1 m, and depths range from 1 m to 211 m. Contours were created at 20 m intervals, smoothed, and exported as vector files.
The data used for etching the topographic map on the top layer are a product called CanVec, which is downloadable for free from GeoGratis. The contour interval was reduced to 200 m to improve visibility, and extraneous shapefiles such as points were removed.
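Each cut layer corresponds to one 20 m depth band: a cell belongs to layer k when the water there is deeper than k × 20 m, so deeper basins accumulate more stacked layers. A toy sketch of slicing a depth grid into per-layer masks (the 1 × 4 depth profile is illustrative):

```python
def layer_masks(depths, interval=20.0):
    """One boolean mask per cut layer: True where the water is deeper
    than that layer's threshold depth."""
    deepest = max(max(row) for row in depths)
    n_layers = int(deepest // interval)
    return [[[d > k * interval for d in row] for row in depths]
            for k in range(1, n_layers + 1)]

# An illustrative profile across a shallow sill (depths in metres).
depths = [[211.0, 45.0, 15.0, 120.0]]
masks = layer_masks(depths)
print(len(masks))  # 10 layers for a 211 m deepest point
```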


The data were manipulated in iDraw (a Mac-based vector graphics program) to smooth out overlapping lines and crop to an appropriate area.


The laser cutter has a 2 x 4 foot cutting bed. In order to save materials and cutting time, layers need to be nested in the bed space, colour-coded for cutting and etching, and exported as either a PDF or SVG. Each contour makes up a layer, with a solid rectangle for the base and the topographic information etched into the top layer. Two cutting surfaces were laid out, each with 5 map layers.


The laser cutting was done at the Danforth Tool Library (http://torontotoollibrary.com) out of 1/4-inch birch plywood. The layers were cleaned (the cutting produces soot), stained, and glued together with carpenter's glue.


Initial plans included the use of etching to detail habitat and substrate information, but time and financial constraints limited the amount of etching work that could be done. Additionally, if the project were repeated, it would be worth either using thinner materials or increasing the contour interval: the slope of the side walls is so steep, and the fjord so narrow, that the fine details are hard to see in the final version.


 

Creating a Vintage Bathymetric Map with QGIS

Creator: Jenny Mason, Ryerson University Graduate Spatial Analysis Student, for Geovis course project, SA8905 (Dr. Rinner)

My geovisualization, in a sense, goes back in time. While many recent data and geovisualizations show how technology has changed the way we view spatial data, I wanted to create a geovisualization that replicates the roots of mapping technology. I chose to create a vintage bathymetric survey map using QGIS.

Judging by the number of tutorials and blog posts available online, replicating the characteristics and design of a vintage map with modern GIS mapping technology is popular among GIS enthusiasts. Using a combination of these geographers' ideas as well as my own, I chose to replicate the timeless style of these maps.

The final map product: 

Vintage Bathymetric Map Using QGIS

Technology: QGIS offers rendering options that blend your layers (map data, elements, text and desired stain/finish) in the Print Composer. For this look, the multiply function was used.

Data source: The Great Lakes bathymetry contour data were provided by Esri Canada for educational purposes only. They come in shapefile format, and it looks as if they were vectorized from the National Oceanic and Atmospheric Administration's (NOAA) bathymetry grid, which is available at https://www.ngdc.noaa.gov/mgg/greatlakes/superior.html.

Approach: To achieve a vintage map look, open a new print composer in QGIS and import a photo along with your map layers. For my design, I used bathymetric data to replicate a historic hydrographic map/survey using the contours of Lake Superior.

Anita Graser's blog post on Vintage Map Design Using QGIS suggests using Lost and Taken's gallery to find aged paper designs at high resolutions. The options for paper on this site are perfect for recreating a vintage-style map.

Once your map and the image of the desired paper texture are added as layers in your print composer, go to the rendering options for the map in the panel on the right of your screen and change the blending mode to Multiply. You will now see in the window that the map elements have been blended with the paper texture.
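The Multiply rendering mode is a per-channel product: for 8-bit channels, out = map × paper / 255, so white map areas take on the paper's tone while dark linework stays dark. A minimal sketch:

```python
def multiply(map_px, paper_px):
    """Per-channel multiply blend of two 8-bit RGB pixels
    (what a Multiply rendering mode computes)."""
    return tuple(a * b // 255 for a, b in zip(map_px, paper_px))

# White map background takes on an aged-paper beige; dark ink barely changes.
print(multiply((255, 255, 255), (222, 205, 170)))  # (222, 205, 170)
print(multiply((40, 40, 40), (222, 205, 170)))     # (34, 32, 26)
```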

With this function, the creator is able to recreate a vintage map of any desired spatial data.

The appropriate choice of text, labels, and other map elements are essential to replicate a vintage map. I found it helpful to reference existing historic maps online and try to replicate their design within QGIS. I imported a North Arrow, but all other elements are available in QGIS.

Animating Toronto Parking Enforcement with heatmap.js

by Justin Pierre – Geovis course project for SA8905, Fall 2015 (Dr. Rinner)

Heatmap.js is a project developed by Patrick Wied to create heatmaps online using JSON data and javascript. It’s lightweight, free to use and comes with tons of great customization options.

For my geovisualization project for SA8905 I created an animated heat map of parking tickets issued in Toronto during the 24 hour period of May 1st 2014. Parking ticket data is supplied on the Toronto Open Data Portal.

Thursday May 1st, 2014 was one of the busiest days of the year for parking tickets. There were 9,559 issued in 24 hours. 6am was the safest time with only 25 tickets issued and 9am was the busiest with 1,451.

To create the heatmap I geocoded the Toronto parking ticket data using the City of Toronto street data with address ranges. About 10% of the records had to be manually geocoded to intersections, which was a time-consuming process! Once I had the locations, it was simple to create a JSON object for each hour in Excel, like this:

var h = [{
  max: 100000,
  data: [
    {lat: 43.667229, lng: -79.382666, count: 1},
    {lat: 43.728744, lng: -79.30461, count: 1},
    {lat: 43.778933, lng: -79.418283, count: 1},
    {lat: 43.647378, lng: -79.418484, count: 1}
    // etc…
  ]
}, /* …one object like this per hour… */];

h is an array with one element per hour; each element is a JSON object containing the lat/lng of every ticket issued in that hour. The count is required by the heatmapping function and is always 1, unless you're this driver:
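The per-hour grouping done in Excel could equally be scripted. A Python sketch of building the same structure (the ticket records and their field layout are illustrative assumptions about the geocoded data):

```python
import json
from collections import defaultdict

# Illustrative geocoded ticket records: (hour issued, lat, lng).
tickets = [(0, 43.667229, -79.382666),
           (0, 43.728744, -79.30461),
           (9, 43.778933, -79.418283)]

by_hour = defaultdict(list)
for hour, lat, lng in tickets:
    by_hour[hour].append({"lat": lat, "lng": lng, "count": 1})

# One object per hour, in the shape heatmap.js expects for setData().
h = [{"max": 100000, "data": by_hour.get(hour, [])} for hour in range(24)]
print(json.dumps(h[0]["data"][0]))  # {"lat": 43.667229, "lng": -79.382666, "count": 1}
```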

Using heatmap.js is super straightforward. Initialize your web map in leaflet or openlayers (I used leaflet), configure some simple parameters:

var cfg = {
  radius: .008,           // set for interpolation radius
  maxOpacity: .8,         // set to .8 to show the basedata
  scaleRadius: true,      // recalibrate radius for zoom
  useLocalExtrema: true,  // reset data maximum based on view
  latField: 'lat',        // where latitude is referenced
  lngField: 'lng',        // where longitude is referenced
  valueField: 'count'     // where the numerical field is
};

Attach that to your heatmap object and point it at your datasource like so:

heatmapLayer = new HeatmapOverlay(cfg);
map.addLayer(heatmapLayer);
var i = 0;
heatmapLayer.setData(h[i]);

Remember that h[] is the array where the ticket data is stored, so h[0] is the first hour of data, midnight to 1 am. This produces a static heatmap.


Now comes the part where we cycle through the hours of data with a setInterval() function:

setInterval(function() {
  i += 1;
  if (i > 23) i = 0;
  $(".heatmap-canvas").fadeOut("slow", function() {
    heatmapLayer.setData(h[i]);
    heatmapLayer._draw();
    $("#hour").html(i);
  });
  $(".heatmap-canvas").fadeIn("slow");
}, 2000);

Every 2,000 milliseconds (2 seconds) the page fades out the heatmap layer, switches the data to the next hour, and fades it back in. If the cycle has reached the end of the day, it resets. The $("#hour").html(i) bit changes the hour printed on the webpage itself.

You can check out the finished project at http://justinpierre.ca/tools/heatmap/ and be sure to let me know what you think at https://twitter.com/jpierre001.

T.Orientation: Colouring the Grids of Toronto

By Boris Gusev, Geovis Course Assignment, SA8905, Fall 2015 (Rinner)

 

The way in which we settle the land around us can paint a rich picture of how our cities have developed over the years. By the turn of the 19th century, urban planners generally agreed that grid-like patterns were the optimal solution and held the most promise for the future of transit; this kind of physical planning shaped automotive cities like Los Angeles, Chicago, and Detroit. Toronto's history of growth can also be traced through its sprawling grid of roads.

In this visualization, a MapZen extract of the OpenStreetMap road network was used to represent the compass-heading-based orientation of Toronto's roads. Streets that are orthogonal, meaning that they intersect at a right angle, are assigned the same colours. At a 90-degree angle, the streets are coloured with the darkest shades of orange or blue, decreasing in intensity as the intersection angle becomes more obtuse.
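The colouring boils down to folding each segment's bearing into a 0–90 degree range, so that orthogonal streets land on the same value, with intensity growing as the folded bearing approaches a right angle. A sketch (the intensity scaling is an illustrative choice, not the exact one in the script):

```python
import math

def bearing(x1, y1, x2, y2):
    """Bearing of a street segment, folded into [0, 180) degrees."""
    return math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0

def orientation_shade(b):
    """Fold a bearing into [0, 90) so orthogonal streets match, and
    derive an intensity that peaks for right-angle (0/90 degree) grids."""
    folded = b % 90.0
    intensity = abs(folded - 45.0) / 45.0   # 1.0 at 0 or 90, 0.0 at 45
    return folded, round(intensity, 3)

# An east-west street and a north-south street fold to the same shade.
print(orientation_shade(bearing(0, 0, 1, 0)))  # (0.0, 1.0)
print(orientation_shade(bearing(0, 0, 0, 1)))  # (0.0, 1.0)
```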

Follow the link to take a look at: Toronto Streets by Orientation


More exciting details and a DIY guide under the cut. Kudos to Stephen Von Worley at Data Pointed for the inspiration and Mathieu Rajerison at Data & GIS Tips for the script and a great how-to.

Continue reading T.Orientation: Colouring the Grids of Toronto

HexBinning Ontario

By Andrew Thompson – Geovis course project, SA8905 (Dr. Rinner)

The power of data visualization is becoming increasingly robust and intricate. The demand to deliver a variety of complex information has led to the development of highly responsive visual platforms. Libraries such as d3 provide increased flexibility to work across multiple web technology stacks (HTML, CSS, SVG), allowing for nearly unlimited customization and the capacity to handle large datasets.


In this development, a combination of d3 and Leaflet is used to provide a data-driven visualization within an easy-to-use mapping engine framework, made possible through the developments of Asymmetrik. This collection of plugins allows the creation of dynamic hexbin-based heatmaps that update and transition dynamically.

The web mapping application is available at: HexBinning Ontario

Discussion of data & techniques follows below…

Continue reading HexBinning Ontario

The Cooling Effect of the 1991 Eruption of Mount Pinatubo, Philippines

By Clarisse Reyna

Geovis Course Assignment, SA8905, Fall 2015 (Rinner)

This is a time series map showing interpolated temperature change. Mount Pinatubo is located on the island of Luzon, Philippines. Its eruption in 1991 was the second largest volcanic eruption of the 20th century. It caused a cooling effect, as it released significant amounts of volcanic gases, aerosols, and ash that increased albedo. With more solar radiation being reflected, less solar radiation reached the troposphere and the surface, causing temperatures to decrease. This is exactly what took place when Mount Pinatubo erupted in 1991: afterwards, an observed surface cooling of around 0.5 to 0.6 degrees Celsius took place in the Northern Hemisphere (Self et al. 1999).

In this time series map, interpolated temperatures in the Philippines from 1988 to 1995 are presented. What you should be able to see is that as time passes after the eruption (1991), there is a significant increase in blue areas, which indicate lower temperatures. Originally, the years included would have been 1985 to 1995; however, there were unusually low temperatures in 1987. In fact, the lowest temperature ever recorded in Manila was 15.1 degrees Celsius, on February 4, 1987. Because 1987's large blue areas of low temperatures could cause confusion when viewing the final time series visualization, that year was omitted from the final geovisualization project.


The purpose of including temperatures from before the 1991 eruption is to let the viewer see temperature trends before the cooling occurred, allowing comparison of trends before and after the eruption. The years included went up to 1995 because it was the last year that still showed temperatures decreasing from 1991 in most of the cities.

The temperature data in this time series geovisualization were taken from a website called WeatherSpark. The data were yearly temperature averages from 1988 to 1995 for the Philippine cities of Aparri, Batangas, Bohol, Catarman, Coron, Manila, Davao, Lapu-Lapu, Pasig, El Nido, Legazpi, and Pagudpud. Temperature data for the city of Boracay were not available, so the province of Malay was used in its place; another province used was Bulacan. These areas are widely spread across the Philippines, which gives a more accurate representation of temperature patterns during interpolation, since the data points are far apart and cover each part of the country. Lastly, the Philippine boundary shapefile was taken from a website called PhilGIS.

The technology used for this time series visualization was Time Slider, which is available in ArcMap (versions ArcGIS 10.0 and up). For each year, the data taken from WeatherSpark for each city or province were interpolated using the Inverse Distance Weighted (IDW) method, creating one raster per year; since eight years are included in this visualization, eight rasters were created. After creating an interpolation raster for each year, a raster catalog was created, each raster was added to it, and time was enabled on the raster catalog layer.
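Inverse Distance Weighting estimates each raster cell as a weighted average of the station values, with weights of 1/d^p (ArcMap's default power is p = 2). A minimal sketch (the station coordinates and temperatures are illustrative):

```python
def idw(x, y, stations, power=2.0):
    """Inverse-distance-weighted estimate at (x, y) from a list of
    (station_x, station_y, temperature) tuples."""
    num = den = 0.0
    for sx, sy, t in stations:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return t                      # cell sits exactly on a station
        w = 1.0 / d2 ** (power / 2.0)
        num += w * t
        den += w
    return num / den

# Two illustrative stations; the midpoint gets the mean of their values.
stations = [(0.0, 0.0, 26.0), (10.0, 0.0, 28.0)]
print(round(idw(5.0, 0.0, stations), 6))  # 27.0
```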

When time is enabled on a layer, ArcMap allows you to use the Time Slider tool to create the time series visualization and preview what it will look like. You can then export the time series visualization to an .avi file.


References

Country Boundary. (2013). In PhilGIS. Retrieved from http://philgis.org/freegisdata.htm

Historical Weather. In WeatherSpark Beta. Retrieved from https://weatherspark.com/

Self, S., Zhao, J., Holasek, R., Torres, R., & King, A. (1999). The Atmospheric Impact of the 1991 Mount Pinatubo Eruption. U.S. Geological Survey.

TTC Subway Stations and LRT Expansion Animated 1954-2021

An animated look at the TTC's subway and LRT expansion by the year each station first opened. Includes 2017's Finch West subway extension and 2021's Eglinton LRT expansion.

By Khakan Zulfiquar – Geovis Course Assignment, SA8905, Fall 2015 (Rinner)

As a course assignment, we were required to develop a professional-quality geographic visualization product that uses novel mapping technology to present a topic of our interest. I chose to create an animated-interactive map using CartoDB to visualize the construction of Toronto Transit Commission (TTC) stations from years 1954 to 2021. The interactive map can be found at https://zzzkhakan.cartodb.com/.


Project Idea

This idea was inspired by Simon Rogers, who animated London's rail system. It was interesting to see an animated map slowly draw together the footprint of an entire infrastructure system. A number of (non-interactive) animations of Toronto's subway system development were collected by Spacing magazine in 2007 and can be viewed at http://spacing.ca/toronto/2007/09/21/ttc-subway-growth-animation-contest/.

A CartoDB feature called "Torque" was used to create the envisioned map. Torque is ideal for mapping a large number of points over time, and has famously been used in the media to map tweets as pings.
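For reference, a Torque animation in CartoDB is configured through Torque-specific CartoCSS properties on the map. A minimal sketch of the kind of settings this map relies on (the `year_opened` column name is hypothetical, and the frame count and duration are illustrative, not the values used in the published map):

```css
Map {
  -torque-time-attribute: "year_opened";     /* column that drives the animation */
  -torque-frame-count: 68;                   /* one frame per year, 1954-2021 */
  -torque-animation-duration: 30;            /* seconds per loop */
  -torque-aggregation-function: "count(cartodb_id)";
  -torque-resolution: 2;
  -torque-data-aggregation: cumulative;      /* keep earlier stations on screen */
}
```

The `cumulative` setting is what lets the network "draw together" over time instead of each year's stations blinking in and out.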

Execution

As a beginner to CartoDB, I went through tutorials and online courses to get familiar with the interface. As I became comfortable with CartoDB and its features, I recalled an example I had seen in the CartoDB gallery: Simon Rogers' London rail system map. I knew exactly the kind of data I would need to make a similar map for TTC stations. Progress came to an instant halt, however, as the data was not readily available. Using Wikipedia, ttc.ca, and OpenStreetMap, I was able to compile the data I required. The data was uploaded into CartoDB and the following map was created.

khakan_1

Tutorial / How-to-Use

The numbered instructions below correspond to the numbered elements in the screenshot above.

  1. Title and Subtitle – Speaks for itself.
  2. Toronto Subway/LRT Map [full resolution] – A map of Toronto's subways and future LRTs produced by the TTC. This map is the most common visual representation of the TTC's subway and LRT network, so its color scheme was mimicked to help viewers, especially those familiar with the TTC, transition smoothly to the animated map.
  3. Timeline – The timeline runs in a continuous loop. You can press pause to stop the animation and resume to start it again, and you can control the animation speed by sliding the play-bar back and forth.
  4. Hover Window – As you hover over a station, a window pops up automatically with the station's name. No clicks required. The names only appear if the "ttc_station" layer is switched on (more on this in step 7).
  5. Info Window – If you would like further information on a certain station, simply click on it and you will be presented with the station's name, line number, grade (above, at, or underground), platform type, etc. The info window only appears if the "ttc_station" layer is switched on (more on this in step 7).
  6. Legend – as the name implies…
  7. Layer Switch – a tool to turn the map's layers on or off. The map was created with the intent to be both animated and interactive: the animated part is the stations being plotted, and the interactive part lets the user find further information about each station. However, the animation is both intrusive and resource-heavy, so an option is included to turn layers on or off as required. Be sure to try out the combinations.
  8. MAIN SHOW – the main map area uses CartoDB's Dark Matter basemap with all of the TTC stations plotted. Feel free to zoom in and out.

Enjoy viewing and exploring.

Story Swipe Map – 2011 / 2015 Election Results

Geovis Course Assignment, SA8905, Fall 2015 (Rinner)
Author: Austin Pagotto
Link to Web app: http://arcg.is/1Yf8Yqn
(Note: project may have trouble loading using Chrome – try Internet Explorer)

Project Idea:

The idea of my project was to comprehensively map the results of the past two Canadian federal elections. While looking for visualization methods to compare this data, I came across the Swipe feature in ArcGIS Online story maps. Along with all the interaction features of any ArcGIS Online web map, this feature lets the user swipe left and right to reveal either different layers or, in my case, different maps. As you can see in the screenshot below, the right side of the map shows the provincial winners of the 2015 election while the left side shows the provincial winners of the 2011 election. The line in the middle can be dragged back and forth to show how the provincial winners differed between elections.

Pic1

Project Execution:

The biggest problem in executing my project was that the default ArcGIS Online projection is Web Mercator, which greatly distorts Canada. I was able to find documentation from Natural Resources Canada explaining how Lambert Conformal Conic basemaps can be uploaded to an ArcGIS Online map to replace the default basemaps.
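To illustrate why a conic projection suits Canada better than Web Mercator, here is a sketch of the Lambert Conformal Conic forward projection using the standard spherical-approximation formulas (Snyder). The standard parallels (49°N, 77°N) and origin loosely follow the Canada Atlas Lambert setup; they are assumptions for illustration, not the exact parameters of the NRCan basemap used in the project:

```python
import math

def lcc_forward(lat, lon, lat1=49.0, lat2=77.0, lat0=49.0, lon0=-95.0, R=6371000.0):
    """Spherical Lambert Conformal Conic forward projection (illustrative sketch).

    lat1/lat2 are the standard parallels; (lat0, lon0) is the projection origin.
    Returns easting/northing in metres relative to the origin.
    """
    p1, p2, p0, p = (math.radians(v) for v in (lat1, lat2, lat0, lat))
    l0, l = math.radians(lon0), math.radians(lon)
    # Cone constant n and scale factor F from the two standard parallels
    n = (math.log(math.cos(p1) / math.cos(p2))
         / math.log(math.tan(math.pi / 4 + p2 / 2) / math.tan(math.pi / 4 + p1 / 2)))
    F = math.cos(p1) * math.tan(math.pi / 4 + p1 / 2) ** n / n
    rho = R * F / math.tan(math.pi / 4 + p / 2) ** n
    rho0 = R * F / math.tan(math.pi / 4 + p0 / 2) ** n
    theta = n * (l - l0)
    return rho * math.sin(theta), rho0 - rho * math.cos(theta)

# The origin maps to (0, 0); points east of the central meridian get x > 0.
x, y = lcc_forward(49.0, -95.0)
```

Because scale is true along both standard parallels, area and shape distortion stays small across Canada's mid and high latitudes, unlike Web Mercator, which inflates the north dramatically.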

Another problem with the visualization was that, when zoomed out to the national scale, many of the individual polling divisions became impossible to see. This is an issue because each polling division is designed to contain a roughly equal population, so the small divisions are no less important or meaningful than the large ones. To solve this, when zoomed out, I changed the symbology to show the party that won the most seats in each province, i.e. the provincial winner seen in the previous screenshot. When zoomed in, however, the individual polling divisions become visible, with their official names shown at higher zoom levels. The year of each election was added to the labels to remind the user which map is on which side.

pic2

The methodology I used was to create two separate online maps, one for each election year, and then create the swipe web app, which loads both maps and lets the user swipe between them. It was important to make sure that all the settings for the two maps were identical (colors, transparency, and attribute names).

The data shown on my maps was downloaded from ArcGIS Online to ArcMap Desktop, then zipped and re-uploaded to my project. It was important to change the data's projection to Lambert Conformal Conic before uploading so that it would not have to be reprojected again by ArcGIS Online.

This project demonstrated how web mapping applications can make visualizing and comparing data much easier than creating two standalone maps.

Data Sources: Projection/Basemap information from Natural Resources Canada
Election Data from ESRI Canada (downloaded from ArcGIS Online)

Link to Web app: http://arcg.is/1Yf8Yqn