Geovisual Project Assignment @RyersonGeo, SA8905, Fall 2022
Introduction
3D visualization is an essential and popular category of geovisualization. After years of development, 3D printing technology has become readily available in daily life, so a 3D-printable geovisualization project is now relatively easy to implement at the individual level. Compared with electronic 3D models, physical 3D-printed models also have clear advantages when explaining results to non-professional users.
DEM Data to a 3D Surface: AccuTrans 3D – which provides translation of 3D geometry between the formats used by many 3D modeling programs.
Converting a 3D Surface to a Solid: Materialise Magics – converts the surface to a solid with thickness, and the model is cut along the boundaries of the 5 Transitional Regions of Ontario. Different thicknesses represent the differences in total population between the Transitional Regions (e.g. the central region has a population of 5 million and a thickness of 10 mm; the west region has a population of 4 million and a thickness of 8 mm).
Slicing & Printing: This step is an indispensable step for 3D printing, but because of the wide variety of printer brands on the market, most of them have their own slicing software developed by the manufacturers, so the specific operation process varies. But there is one thing in common, after this step, the file will be transferred to the 3D printer, and what follows is a long wait.
Visualization
The 5 Transitional Regions are a regrouping of the 14 Local Health Integration Networks (LHINs), and the corresponding populations and model heights (thicknesses) for the five regions of Ontario are:
West, clustering of: Erie-St. Clair, South West, Hamilton Niagara Haldimand Brant, Waterloo Wellington, has a total population of about 4 million, the thickness is 8mm.
Central, clustering of: Mississauga Halton, Central West, Central, North Simcoe Muskoka, has a total population of about 5 million, the thickness is 10mm.
Toronto, clustering of: Toronto Central, has a total population of about 1.4 million, the thickness is 2.8mm.
East, clustering of: Central East, South East, Champlain, has a total population of about 3.7 million, the thickness is 7.4mm.
North, clustering of: North West, North East, has a total population of about 1.6 million, the thickness is 3.2mm.
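The thicknesses above follow a simple scaling of roughly 2 mm per million residents. Here is a minimal sketch of that arithmetic in R, using the rounded population figures from the list; the scaling factor is inferred from those values rather than stated by the project.

```r
# Thickness scaling implied by the list above: about 2 mm per million residents.
# Populations are the rounded figures from the list; the factor is inferred, not official.
regions <- data.frame(
  region     = c("West", "Central", "Toronto", "East", "North"),
  population = c(4.0e6, 5.0e6, 1.4e6, 3.7e6, 1.6e6)
)
mm_per_million <- 2
regions$thickness_mm <- regions$population / 1e6 * mm_per_million
regions$thickness_mm   # 8.0 10.0 2.8 7.4 3.2, matching the heights listed above
```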
Limitations
The most unavoidable limitation of 3D printing is the accuracy of the printer itself, which depends not only on the printer’s mechanical performance but also on the materials used, the operating environment (temperature, UV intensity) and other external factors. As a result, the printed models do not match the digital design exactly, even though they are accurate on the computer. In addition, the 3D-printed terrain can only represent variables that can be expressed as a single value per region, such as the total population chosen here.
Geovis Project Assignment @RyersonGeo, SA8905, Fall 2022
Background
Toronto’s rapid transit system has been constantly growing throughout the decades. This transit system is managed by the Toronto Transit Commission (TTC) which has been operating since the 1920s. Since then, the TTC has reached several milestones in rapid transit development such as the creation of Toronto’s heavy rail subway system. Today, the TTC continues to grow through several new transit projects such as the planned extension of one of their existing subway lines as well as by partnering with Metrolinx for the implementation of two new light rail systems. With this addition, Toronto’s rapid transit system will have a wider network that spans all across the city.
Based on this, a geovisualization product will be created which animates the history of Toronto’s rapid transit system and its development throughout the years. This post provides a step-by-step tutorial on how the product was created and shows the final result at the end.
Geovis Project Assignment @RyersonGeo, SA8905, Fall 2021
Introduction/ background
Every city has zoning bylaws that dictate land use. Most cities, including the City of Toronto, have zoning bylaws that set building height limits for different zoning areas. Sometimes, buildings are built above the height limit, either due to development agreements or grandfathering of buildings (when a new zoning by-law doesn’t apply to existing buildings). The aim of this project is to provide a visualization tool for assessing which buildings in Toronto are within the zoning height limits and which are not.
Data and Processing
3D Buildings
The 3D building data was retrieved from Toronto Open Data and derived using the following methods:
LiDAR (2015)
Site Plans – building permit site plan drawings
Oblique Aerials – oblique aerial photos and “street view” photos accessible in Pictometry, Google Earth, and Google Maps.
3D Model – digital 3D model provided by the developer
Zoning Bylaws
Two zoning Bylaw shapefiles were used (retrieved from Toronto Open Data as well):
Building Height Limits – spatially joined (buildings within zoning area) to the 3D buildings to create the symbology shown on the map. Categories were calculated using the max average building height (3D data) and the zoning height limit (zoning bylaws).
Zoning Categories – used to gain additional information and investigate how or why buildings went over the zoning height limit.
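To make the category calculation concrete, here is a small hypothetical sketch in R of the logic described above: each building’s maximum height (from the 3D data) is compared against the zoning height limit of the zone it falls within, with a separate class for the 1 m tolerance mentioned under Limitations. The field and category names are placeholders, not the actual attribute names used in the map.

```r
# Hypothetical sketch of the height-category calculation; field and category
# names are placeholders rather than the map's actual attribute names.
classify_height <- function(max_height, height_limit) {
  ifelse(is.na(max_height) | is.na(height_limit), "No data",
  ifelse(max_height <= height_limit,              "Within limit",
  ifelse(max_height - height_limit <= 1,          "Within 1 m of limit",
                                                  "Above limit")))
}

classify_height(max_height   = c(25, 32, 30.5, NA),
                height_limit = c(30, 30, 30,   30))
# "Within limit" "Above limit" "Within 1 m of limit" "No data"
```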
Geovisualization
ArcGIS Experience Builder was used to visualize the data. A local scene with the relevant data was uploaded as a web scene and chosen as the data source for the interactive map in the “Experience”. The map includes the following aspects: a legend showing the zoning and height categories; a layer list allowing users to toggle the zoning category layer on for further exploration of the data; and a “Filter by Height Category” tool that allows users to view buildings within a selected height category. Pop-ups are enabled for individual buildings and zones for additional information. Some zones include bylaw exceptions which may explain why some of the buildings within them are allowed to be above the zoning height limit (only an exception code is provided; a Google search is required to gain a better understanding). Instructions and details about the map are provided to the user as well.
Limitations
The main limitation of this project is insufficient data – a lack of either building height or zoning height results in a category of “No data”, which is displayed as grey buildings. Another limitation is the accuracy of the data, as LiDAR data can sometimes be off and provide wrong estimates of building height. Inaccuracies within 1 m were handled by adding an additional category, but there may be inaccuracies beyond that as well.
GeoVis Project @RyersonGeo, SA8905, Fall 2021, Mirza Ammar Shahid
Introduction
Commercial real estate is a crucial part of the economy and a key indicator of a region’s economic health. In this project, different types of under-construction projects within the Toronto market will be assessed. Projects that are under construction or proposed to be completed within the next few years will be visualized. Some of the property types examined are hospitality, office, industrial, retail, and sports and entertainment. The distribution of each property type within the regions will be displayed, to determine the proportional distribution by property type within each region. Tableau will be used to create an interactive visualization of the data that can be explored through different data filters.
Data
The data for the project was obtained from the CoStar Group’s database. The data was exported for all properties within the Toronto submarkets (York Region, Durham Region, Peel Region, Halton Region). Under-construction or proposed properties above 7,000 sq ft were exported for the analysis. Property name, address, submarket, size, longitude, latitude and year built were some of the attributes exported for each property project.
Method
Once data was filtered and exported from the source, the data was inserted into Tableau as an excel file.
The latitude and longitude were placed in the rows and columns in order to create a map in Tableau for visualization.
The density mark type was used to show density, and a filter was applied for property type.
A second sheet was created with the same parameters, but circle marks were used instead of density to identify the location of each individual project (under-construction projects).
A third sheet was created with property type on the x-axis and the proportion of each type within each region on the y-axis, to show the proportions of each property type by region (a small sketch of this calculation follows below).
The three worksheets were used to compile an interactive dashboard for optimal visualization of the data.
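As referenced above, the proportion calculation behind the third sheet can also be sketched outside Tableau; the tiny data frame below is purely illustrative and stands in for the exported CoStar records.

```r
# Illustrative sketch of the third sheet's calculation: the share of each
# property type within each region. 'projects' stands in for the exported data.
projects <- data.frame(
  region = c("Peel", "Peel", "Toronto", "York", "York", "Durham"),
  type   = c("Industrial", "Flex", "Office", "Retail", "Retail", "Sports & Entertainment")
)
counts      <- table(projects$region, projects$type)
proportions <- prop.table(counts, margin = 1)   # row-wise: shares within each region
round(proportions, 2)
```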
Results
The results are quite intriguing as to where construction of certain property types dominates over the rest. Flex is greatest in Peel Region, health care in Toronto, hospitality in Halton, industrial in Peel, multifamily in Toronto, office in downtown Toronto, retail in York Region, specialty in York Region, and sports and entertainment in Durham, with the new casino opening in Ajax.
The final dashboard can be seen below, however due to sharing restrictions, the dashboard can only be accessed if you have a Tableau account.
Conclusion
In conclusion, an under-construction commercial real estate dashboard can have a positive impact on multiple entities within the sector. Developers can use such geovisualizations to monitor ongoing projects and find new projects within opportunity zones. Brokerages can use it to find new leads and potential listings and to manage existing listings. Governments at all three levels (municipal, provincial and federal) can use these dashboards to monitor the economic health of their constituencies and make insightful policy changes based on facts.
The inspiration behind creating this geovisualization project stems from my own curiosity about Toronto’s tourism industry and my love of the hometown hockey team. There have been numerous instances where I found myself stressed and anxious about planning a stay in Toronto due to the overwhelming number of options for every element of my stay. I wanted to create content in an interactive manner that would narrow the scope of options for accommodations, restaurants, and other attractions in a user-friendly way. With a focus on attending a Toronto Maple Leafs game, I have created an interactive map that presents readers with hotels, restaurants and other attractions that are highly reviewed, along with additional descriptions that may prove useful to those going to these places for the first time. Each of these locations is located under 1 kilometer from the Scotiabank Arena, to ensure that patrons will not require extensive transportation and can walk from venue to venue. The intent behind the interactive map is also to increase fan engagement by helping fans find a sense of community within the selected places and by easing potential stressors of planning their stay. For a Toronto Maple Leafs fan, the fan experience starts before the game even begins.
Why Story Map?
Esri’s Story Map was chosen to conduct this project because it is a free user-friendly method that allows anyone with an Esri Online account to create beautiful stories to share with the world. By creating a free platform, any individual or business can harness the benefits of content creation for their own personal pleasure or for their small business. Furthermore, the Shortlist layout was chosen to include images and descriptions about multiple locations for the Story Map to give readers visual cues of the locations being suggested. The major goal behind using this technology is to ensure that individuals in any capacity can access and utilize this platform by making it accessible and easy to understand.
Data
To obtain the data for the specific locations of the hotels, restaurants, and other attractions, I inspected various travel websites for their top 10 recommendations. From these recommendations, I selected commonalities among the sites and included other highly recommended venues to incorporate diversity among the selection. For the selected hotels, I attempted to include various category levels to accommodate different budgets of those attending the Leafs game. Additionally, all attractions chosen do require an additional purchase of tickets or admission, but vary in price point as well.
Creating Your Story Map
Start the Story Map Shortlist Builder using a free ArcGIS public account on ArcGIS Online.
Create a title for your interactive map under the “What do you want to call your Shortlist?”. Try to be as creative, but concise, as possible!
The main screen will now appear. You can now see your title on the top left, as well as a subtitle and tabs below. To the right, there is a map that you can alter as you like. To add a place, click the “Add” button within the tab frame. This will allow you to create new places that you want to further describe.
A panel will appear where you can enter the name of the chosen destination, provide a picture, include text, and specify its location. You can include multiple images per tab using the “Import” feature. Once the location has been specified using the venue’s address, a marker will appear on the map. You are able to click and drag this marker to any destination that you choose. The colour of the marker correlates to the colour of the tab. Additionally, you can include links within the description area to redirect readers to the respective venue’s website.
Click the “+” button on the top right hand corner of the left side panel to add more destinations. The places that you add will show as thumbnails on the left side of the screen. Click the “Organize” button underneath the tab to reorder the places. You can order these in any way that seems logical for your project. Click “Done” when satisfied.
To create multiple tabs, click the “Add Tab” button. To edit a tab, click the “Edit Tab” button. This will allow you to change the colour of the tab and its title.
To save your work, press the “Save” button occasionally, so all of your hard work is preserved.
There are also optional elements that you can include as well. You can change the behaviour and appearance of your Shortlist by clicking the “Settings” button. You are able to change the various functions people can utilize on the map. This includes implementing a “Location Button” and “Feature Finder” where readers can see their own location on the map and find specific locations on the map, respectively. You are also able to change the colour scheme and header information by clicking on their tab options. Hit “Apply” when satisfied.
To share your Shortlist click the “Save” button and then click the “Share” button. You can share publicly or just within your organization. Additionally, you can share using a url link or even embed the Story Map within a website.
Limitations & Future Work
The main limitation of this project was selecting what venues to include. Toronto is a lively city with an overwhelming amount of options for visitors to choose from, resulting in many places being overlooked or unaccounted for. Overall, the businesses chosen represent a standard set of places for those who are unfamiliar with the city. To include a more diverse set of offerings, an addition to the current project, or an entirely new project, can be created to include places that provide more niche products/services. Furthermore, a large portion of the venues chosen were selected from travel/tourism advisory websites where the businesses on the sites may pay a fee to be included, thus limiting the amount of exposure other businesses may have.
Overall Thoughts
Story Map was simple to understand and the platform was aesthetically pleasing. My only reservation about this platform is the limited amount of stylization control over the text and other design elements. I would most likely use this platform again, but may attempt to find a technology that allows for more control over the overall appearance and settings of the geovisualization.
Over the course of the pandemic, the City of Toronto has implemented a COVID-19 webpage focused on providing summary statistics on the current extent of COVID-19 cases in the city. Since the beginning of the pandemic, this webpage has greatly improved, yet it still lacks the functionality to analyze spatio-temporal trends in case counts. Despite not providing this functionality directly, the City has released the raw data for each reported case of COVID-19 since the beginning of the pandemic. Using RStudio with the leaflet and shiny libraries, a tool was designed to allow for the automated collection, cleaning and mapping of this raw case data.
DATA
The raw case data was downloaded from the Toronto Open Data Portal in R, and added to a data frame using read.csv. As shown in the image below, this data contained the neighbourhood name and episode date for each individual reported case. As of Nov. 30th, 2020, this contained over 38,000 reported cases. Geometries and 2016 population counts for the City of Toronto neighbourhoods were also gathered from the Toronto Open Data Portal.
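A minimal sketch of this loading step, assuming a placeholder URL for the CSV resource (the real link comes from the Toronto Open Data Portal):

```r
# Load the raw case data into a data frame with read.csv, as described above.
# The URL below is a placeholder; the actual CSV link comes from the Toronto Open Data Portal.
cases_url   <- "https://example.org/COVID19_cases.csv"
covid_cases <- read.csv(cases_url, stringsAsFactors = FALSE)

nrow(covid_cases)   # over 38,000 rows as of Nov. 30th, 2020
head(covid_cases)   # one row per reported case, including neighbourhood name and episode date
```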
PREPARING THE DATA
After gathering the necessary inputs, an extensive amount of cleaning was required to allow the case data to be aggregated to Toronto’s 140 neighbourhoods, and this process had to be repeatable for each new instance of the COVID-19 case data that was downloaded. Hyphens, spaces and other minor inconsistencies between the case and neighbourhood data were resolved. Approximately 2.5% of all COVID cases in this dataset were also missing a neighbourhood name to join on. Instead of discarding these cases, a ‘Missing cases’ neighbourhood was created to hold them. The number of cases for each neighbourhood by day was then counted and transposed into a new data table. From there, using ‘rowSums’, the cumulative number of cases in each neighbourhood was obtained.
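A sketch of that cleaning and aggregation, assuming placeholder column names `neighbourhood` and `episode_date` (the real field names may differ):

```r
# Assign cases with no neighbourhood name to a 'Missing cases' neighbourhood
covid_cases$neighbourhood[is.na(covid_cases$neighbourhood) |
                          covid_cases$neighbourhood == ""] <- "Missing cases"

# Normalize hyphens, spacing and case so names match the neighbourhood boundary file
clean_name <- function(x) tolower(gsub("[- ]+", " ", trimws(x)))
covid_cases$neighbourhood <- clean_name(covid_cases$neighbourhood)

# Count cases per neighbourhood per day, then total them per neighbourhood
cases_by_day     <- table(covid_cases$neighbourhood, covid_cases$episode_date)
cumulative_cases <- rowSums(cases_by_day)
head(sort(cumulative_cases, decreasing = TRUE))
```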
Unfortunately, in its current state, the R code will only gather the most recent case data and calculate cumulative cases by neighbourhood. Based on how the data was restructured, calculating cumulative cases for each day since the beginning of the pandemic was not achieved.
CREATING A SHINY APP USING LEAFLET
Using leaflet, all of this data was brought together into an interactive map. Raw case counts were converted to rates per 100,000 and classified into quintiles. The two screenshots below show the output and the popup functionality added to the leaflet map.
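A sketch of how such a leaflet map can be assembled, assuming an sf object `neighbourhoods` with the cumulative counts and 2016 populations already joined (all object and column names here are placeholders):

```r
library(sf)
library(leaflet)

# Rate per 100,000 residents, classified into quintiles
neighbourhoods$rate_per_100k <- neighbourhoods$cumulative_cases /
                                neighbourhoods$population_2016 * 100000
pal <- colorQuantile("YlOrRd", neighbourhoods$rate_per_100k, n = 5)

leaflet(neighbourhoods) %>%
  addProviderTiles(providers$CartoDB.Positron) %>%
  addPolygons(
    fillColor = ~pal(rate_per_100k), fillOpacity = 0.7,
    weight = 1, color = "grey",
    popup = ~paste0(neighbourhood, ": ", round(rate_per_100k), " cases per 100,000")
  ) %>%
  addLegend(pal = pal, values = ~rate_per_100k, title = "Cases per 100,000 (quintiles)")
```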
In its current state, the map is only produced on a local instance and requires RStudio to run. A number of challenges were faced when attempting to deploy this map application, and unfortunately, the map was not able to be hosted through the shiny apps cloud-server. As an alternative, the map code has been made available through a GitHub repository at the top of this blog post. This repository also includes a stand-alone HTML file with an interactive map.
LIMITATIONS
There are a couple of notable limitations to mention considering the data and methods used in this project. For one, the case data only supports aggregation to Toronto neighbourhoods or forward sortation areas (FSAs). At this spatial scale, trends in case counts are summarized over very large areas and are not likely to accurately represent finer, local-level patterns. This includes the modifiable areal unit problem (MAUP), which describes the statistical biases that can emerge from aggregating real-world phenomena into arbitrary boundaries. The reported cases derived from Toronto Public Health (TPH) are likely subject to sampling bias and do not provide a complete record of the pandemic’s spread through Toronto. Among these limitations, I must also mention my limited experience building maps in R and deploying them onto the Shinyapps.io platform.
FUTURE GOALS
With the power of R and its many libraries, there are a great many improvements to be made to this tool, but I will note a few of the significant updates I would like to implement over the coming months. Foremost is to use the ‘leaftime’ R package to add a timeline function, allowing map users to analyze changes over time in reported neighbourhood cases. Adding a function to quickly extract the map’s data into a CSV file, directly from the map’s interface, is another immediate goal for this tool. This CSV could contain a snapshot of the data based on a particular time frame identified by a user. The last functionality planned for this map is the ability to modify the classification method used. Currently, the neighbourhoods are classified into quintiles based on cumulative case counts per 100,000. Using leaflet’s ‘leafletProxy’ functionality would allow map users greater control over map elements. It should be possible to allow users to define the number of classes and the classification method (i.e. natural breaks, standard deviation, etc.) directly from the map application.
Banking in the 21st century has evolved significantly, especially in the hyper-competitive Canadian market. Nationally, the big banks have a limited population and wealth share to capture given Canada’s small population, and have been active in innovating their retail footprint. In this case study, TD Bank is the point of interest given its large branch network footprint in the Toronto CMA. The bank has 144 branches within the City of Toronto, which is used as the study area for the dashboard created. The dashboard analyzes market potential, branch network distribution, banking product recommendations and client insights to help derive analytics through a centralized and interactive data visualization tool.
Technology
The technology selected for the geovisualization component is Tableau, given its friendly user interface, mapping capabilities, data manipulation and overall excellent visualization experience. However, Alteryx was widely used for the build-out of the datasets that run in Tableau. As the data was extracted from various different sources, the spatial processing and combining of datasets were all done in Alteryx. The data extracted for expenditure, income and dwelling composition was merged and indexed in Alteryx. The TD branch locations were web-scraped live from the Branch Locator, and the trading areas (1.5 km buffers) were also created in Alteryx. The software was also used for all the statistical functions; the indexed data points in the workbook, for example, were all created in Alteryx. The geovisualization component is created entirely within the Tableau workbook, where multiple sheets are leveraged to create an interactive dashboard for the end user.
Data Overview
There are several datasets used to build the multiple sheets in the Tableau workbook, ranging from Environics expenditure data and Census data to web-scraped TD branch locations. In addition to these datasets, a client and trade area geography file was also created. The client dataset was generated by leveraging a random name and Toronto address generator, and those clients were then profiled to their corresponding market. The data collected ranges across a wide variety of sources and geographic extents to provide a fully functional view of the banking industry. This begins by extracting and analyzing the TD branches and their respective trade areas. The trading areas are created as limited buffers representing the immediate market opportunity for the respective branches. Average income and dwelling composition variables are then used at the Dissemination Area (DA) geography from the 2016 Census. Although income is represented as an actual dollar value, all market demographics are analyzed and indexed against Toronto CMA averages. As such, these datasets, combined with market, client and TD-level data, provide the full conceptual framework for this dashboard.
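Indexing a DA-level value against the CMA average is simple arithmetic; the sketch below uses made-up income figures purely to illustrate the calculation (an index of 100 represents the CMA benchmark).

```r
# Illustrative indexing of DA values against the Toronto CMA average (100 = CMA average).
# All figures are made up for the example.
da_income    <- c(82000, 61000, 120000)   # hypothetical DA-level average incomes
cma_income   <- 85000                     # hypothetical CMA average income
income_index <- round(da_income / cma_income * 100)
income_index   # 96 72 141 - below or above the CMA benchmark
```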
Tables & Visualization Overview
Given the structure of the datasets, six total tables are utilized to combine and work with the data to provide the appropriate visualization. The first two tables are the branch level datasets which begin with the geographic location of the branches in the City of Toronto. This is a point file taken from the TD store locator with fundamental information about the branch name and location attributes. There is a second table created which analyzes the performance of these branches in respect to their client acquisition over a pre-determined timeframe.
The third table consists of client-level information for ‘frequent’ clients (clients transacting at branches 20+ times in a year). Their information builds on the respective geography and identifies who the client is and where they reside, along with critical information the bank can use to run statistical analytics. The client table shows the exact location of those frequent clients, their names, unique identifiers, their preferred branch, current location, average incomes, property/dwelling value and the mortgage payments the bank collects. This table is then combined to understand the client demographics and wealth opportunity from these frequent clients at the respective branches.
Tables four and five are extremely comprehensive, as they visualize the geography of the market (the City of Toronto at a DA level). This provides a full trade-area, market-level breakdown of the demographics and trading areas: DAs are attributed to their closest branch, which lets users see where the bank has market coverage and where the gaps reside. Beyond the allocation of branches, the geography also carries a robust set of demographics such as growth (population, income), dwelling composition and structure, average expenditure, and the product recommendations the bank can target, driven by the average expenditure datasets. Although the file holds a significant amount of data and can seem overwhelming, only selected data is fully visualized. It also includes a full breakdown of how many frequent clients reside in the respective markets and what kinds of products are recommended based on the market demographics analyzed through dwelling composition, growth metrics and expenditure.
The final table provides a visualization and breakdown of the five primary product lines of business the bank offers, which are combined with the market-level data and cross-validated against the average expenditure dataset. This is done to identify what products can be recommended throughout the market based on current and anticipated expenditure and growth metrics. For example, markets with high population, income and dwelling growth but limited spend would be targeted with mortgage products, given that the anticipated growth and limited spend indicate a demographic saving to buy a home in a growth market. These assumptions are made across the market based on the actual indexed values, and as such every market (DA) is given a product recommendation.
Dashboard
Based on the full breakdown of the data extracted, the build-out and the tables leveraged as seen above, the dashboard is fully interactive and driven by one prime parameter which controls all elements of the dashboard. Additional visualizations such as the products visualization, the client distribution treemap and the branch trends bar graph are combined here. The products visualization provides a full breakdown of the products that can be recommended based on their value and categorization to the bank. The value is driven by the revenue the product can bring, as investment products drive higher returns than liabilities. This is then broken down into three graphs covering the number of times each product is recommended and the market coverage the recommendations provide across Stocks, Mortgages, Broker Fees, Insurance and Personal Banking products. The client distribution treemap provides an overview by branch of how many frequent clients reside in the branch’s respective trading area. This provides a holistic approach to anticipating branch traffic trends and capacity constraints, as branches with a high number of frequent clients would require larger square footage and staffing models to adequately service the dependent markets. The final component is the representation of client trends over a five-year run rate to identify the growth the bank experienced in the market and at a branch level through new client acquisition. This provides a full rundown of the number of new clients acquired and how performance varies year over year to identify areas of high and low growth.
This, combined with the three primary mapping visualizations, creates a fully robust and interactive dashboard for the user. Parameters are heavily used and are built on a select-by-branch basis to dynamically change all 6 live elements to represent what the user input requires. This is one of the most significant capabilities of Tableau: the flexibility of using a parameter to analyze the entire market, one branch at a time, or to analyze markets without a branch is extremely powerful in deriving insights and analytics. The overall dashboard then zooms in/out as required when a specific branch is selected, highlighting its location, its respective frequent clients, the trade area breakdown, the kinds of products to recommend, the branch’s client acquisition trends and the actual number of frequent clients in the market. This can also be expanded to analyze multiple branches or larger markets overall if the functionality is required. Overall, the dashboard consists of the following elements:
1. Market DA Level Map
2. Branch Level Map
3. Client Level Map
4. Client Distribution (Tree-Map)
5. Branch Trending Graph
6. Product Recommendation Coverage, Value and Effectiveness
This, combined with the ability to manipulate and store a live feed of data and the parameters used for this level of analysis, brings a new capacity for visualizing large datasets and provides a robust, interactive playground for deriving insights and analytics.
Geovis Project Assignment @RyersonGeo, SA8905, Fall 2019
Background/Introduction
The Toronto Police Service has been tracking and storing historical crime information by location and time across the City of Toronto since 2014. This data can now be downloaded by the public as Excel files and spatial shapefiles and can be used to help forecast future crime locations and times. I have decided to use a set of data from the Police Service’s Data Portal to create a time-series map showing crime density throughout the years 2014 to 2018. The data I have decided to work with are auto theft, break and enter, robbery, theft and assault. The main idea of the video map is to show a series of heat density maps at month-long intervals between 2014 and 2018 in the City of Toronto, with a focus on downtown Toronto, as most crimes happen within the heart of the city.
The end result is an animated time-series map that shows density heat map snapshots during the 4-year period, one 3-month interval at a time. Examples are shown at the end of this blog post under Heat Map Videos.
Dataset
All datasets were downloaded through the Toronto Police Services Data Portal, which is accessible to the public.
The data that was used to create my maps are:
Assault
Auto Theft
Robbery
Break and Enter
Theft
Process Required to Generate Time-Series Animation Heat Maps
Step 1: Create an additional field to store the date interval in ArcGIS Pro.
Add the shapefile downloaded from the Toronto Police Services Portal into ArcGIS Pro.
First create a new field under View Table and then click on Add.
To get only the date, we use the Calculate Field tool in the Geoprocessing tools with the formula
date2 = !occurrence![:10]
where occurrence is the existing text field that begins with the 10-character date YYYY-MM-DD. This removes the time of day, which is unnecessary for our analysis.
Step 2: Create a layer using the new date field created.
Go into the properties of the edited layer. Under the Time tab, select the new date field created in Step 1 and enter the time extent of the dataset. In this case, it will be from 2014-01-01 to 2018-12-31, as the data covers 2014 to 2018.
Step 3: Create Symbology as Heat Map
Go into the Symbology properties for the edited layer and select Heat Map from the drop-down menu. Set the radius to 80, which controls the size of the density concentration in the heat map. Choose a color scheme and set the method to Dynamic. The method determines how each color in the scheme relates to a density value. With the Dynamic method, as opposed to Constant, the density is recalculated each time the map scale or map extent changes, to reflect only those features that are currently in view. The Dynamic method is useful for viewing the distribution of data in a particular area, but is not valid for comparing different areas across a map (ArcGIS Pro Help Online).
Step 4: Convert Map to 3D global scene.
Go to the View tab at the top and select Convert to Global Scene.
This will allow the user to create a 3D map view when showing their animated heat map.
Step 5: Creating the 3D look.
Once a 3D scene is set, press and hold the middle mouse button and drag it down or up to create a 3D effect.
Step 6: Setting the time-series map.
Under the Time tab, set the start time and end time to create the 3 month interval snapshot. Ensure that “Use Time Span” is checked and the Start and End date is set between 2014 and 2018. See the image below for settings.
Step 7: Create Time Slider Steps for Animation Purposes
Under the Animation tab, select the appropriate “Append Time” (the transition time between each frame). Usually 1 second is good enough; anything higher will be too slow. Make sure to check off maintain speed and append front before importing the Time Slider Steps. See the image below.
Step 8: Editing additional cosmetics onto the animation.
Once the animation is created, you may add any additional layers to the frames such as Titles, Time Bar and Paragraphs. There is a drop-down section in the Animation tab that will allow you to add these cosmetic layers onto the frame. The Animation Timeline, by frames, will look like the image below.
Step 9: Exporting to Video
There are many export types the user can choose from, such as YouTube, Vimeo, Twitter, Instagram, HD1080 and GIF. See the image below for the settings used to export the created animation video. You can also choose the number of frames per second; as this is a time-series snapshot, no more than 30 frames per second is needed. Choose where you would like to export the video and, lastly, click on Export.
Conclusion/Recommendation/Limitation
As this was one of my first times using the ArcGIS Pro software, I found it very intuitive to learn, as all the functions were easy to find and ready to use. I was lucky to find a dataset that I didn’t have to reformat much, as the main fields I required were already there and the only thing needed was editing the date format. The amount of data in the dataset was sufficient for me to create a time-series map showing enough data across the City of Toronto spanning 3 months at a time. If there had been less data, I would have had to increase my time span. The 3D scene in ArcGIS Pro is very slow and created a lot of problems for me when trying to load my video onto set time frames. As a result of the high-quality 3D setting I decided to use, it took a couple of hours to render my video through the export tool. As ArcGIS Pro wasn’t made to create videos, I felt that there was a lack of video editing tools.
Heat Map Videos Export
Theft in Downtown Toronto between 2014-2018. A Time-Series Heat Map Animation using a 3 month Interval.
Robbery in Downtown Toronto between 2014-2018. A Time-Series Heat Map Animation using a 3 month Interval.
Break and Enter in Downtown Toronto between 2014-2018. A Time-Series Heat Map Animation using a 3 month Interval.
Auto Theft across the City of Toronto between 2014-2018. A Time-Series Heat Map Animation using a 3 month Interval.
Assault across the City of Toronto between 2014-2018. A Time-Series Heat Map Animation using a 3 month Interval.
Geovis Project Assignment @RyersonGeo, SA8905, Fall 2019
CARTO is an online tool for creating online maps and dashboards and performing spatial analysis. Basic membership is free and no coding experience is required to get your maps online. I created my project on visualizing Toronto Fire Service data entirely in CARTO. The embedded map is below, or you can click here to see it in a new tab.
I’ll briefly explain how I created my map and how you can too.
Before we get to CARTO, we’ll need our data. The City of Toronto’s Open Data portal contains lots of free data on city services and life. From the portal I downloaded shapefiles of TFS stations and run areas (catchment areas for fire stations), and a CSV file of fire incidents.
Next create a CARTO account if you don’t already have one. Once logged in, the CARTO home page will have links to “Getting Started”, “New Map”, and “New dataset.” The Getting Started page is an excellent tutorial on CARTO for first time users.
Before we start making a map, we will need to upload our data. Click “new dataset” and follow the prompts. Note, CARTO requires shapefiles to be archived in a ZIP file.
Once that is done, click on “new map” and add your uploaded datasets. CARTO will add your datasets as layers to the map, zoom to layer extent, and automatically create a point layer out of the CSV file.
The map is on the right side of the screen and a control panel with a list of the uploaded layers is on the left. From here we can do a couple of things:
Re-title our map by double clicking on the default title
Rearrange our layers by dragging and dropping. Layer order determines drawing order. Rearrange the layers so that the stations and incidents points are on top of the run area polygon.
Change the base map. I’ve used Positron Lite for a simple and clean look. Note, CARTO has options to import base maps and styles from other sites, or to create your own.
Click on the layer card to bring up that layer’s options menu.
Let’s click on the fire stations layer. As with the map, we can rename the layer by double clicking on the name. The layer menu has five panes: Data, Analysis, Style, Pop-Up, and Legend. The Style pane is selected by default. The first section of the Style pane is aggregation, which is useful for visualizing dense point layers. We’ll keep the default aggregation of By Point. Section 2, Style, controls the appearance of the layer. I’ve changed my point colour to black and increased the size to 12, as I need the stations to stand out from the incident points.
Now with the incidents layer, I decided to use the Animation aggregation option. If the point layer has a column representing time, we can use this to create an animation of the points appearing on the map over time. This option creates a time slider widget at the bottom of the map with a histogram representing the number of fires over time.
With the run areas, I decided to create a choropleth map where run areas with a higher number of incidents appear darker on the map. To do this, I first need to determine how many incident points fall into each run area. Go to the run area menu, click on Analysis, then “+Add New Analysis.” CARTO will navigate to a new page with a grid of its spatial analysis options. Click on “Intersect and Aggregate,” which finds “overlapping geometries from a second layer and aggregate its values in the current layer.”
CARTO will navigate back to the Analysis pane of the run area menu and display options for the analysis. Run area should already be selected under Base Layer. Choose incidents as the target layer, and under Measure By select count. CARTO will display a message stating new columns have been added to the data, count_vals and count_vals_density.
There will be an option to style the analysis. Click on it. Choose “by value” for Polygon Colour, and choose the new count_vals_density for Column, then select an appropriate colour scheme.
CARTO’s widget feature creates small boxes to the right of the map with useful charts and stats on our data. You can click on the Widgets pane to start adding new widgets from a grid (as with Analysis), or add new widgets based on a specific layer from that layer’s Data pane. CARTO has four types of widgets:
Category creates a horizontal bar chart measuring how many features fit into a category. This widget also allows users to filter data on the map by category.
Histogram creates a histogram measuring a selected variable
Formula displays a statistic on the data based on a selected formula
Time Series animates a layer according to its time information.
As with layers, clicking on a widget brings up its option menu. From here you can change the source data layer, the widget type, and configure data values. For my Fires by Run Area widget, I used the incidents layer as the source, aggregated by id_station (fire station ID numbers) using the count operation. This widget counts how many incidents each station responded to and displays a bar chart of the top 5 stations. Clicking on a station in the bar chart will filter the incidents by the associated station. After this, I added four formula based widgets.
We’re nearly done. Click on the “Publish” button on the bottom left to publish the map to the web. CARTO will provide a link for other users to see the map and an HTML embed code to add it to a web page. I used the embed code to add the embedded map at the beginning of this post.
Thanks for reading. I hope you’ll use CARTO to create some nice maps of your own. You may be interested in checking out the CARTO blog to see other projects built on the platform, or the Help section for more information on building your own maps and applications.
Geovis Project Assignment @RyersonGeo, SA8905, Fall 2019
By: Julia DiMartella-Orsi
Introduction:
ESRI’s creation of the Story Map changed the way we can visualize data. Not only did it allow a broader audience to interact and create their own maps thanks to its easy-to-use design, it also contains many amazing new functions, templates, and themes. Users can personalize their story by adding their own images, text, videos, and map layers after creating a free ArcGIS Online account. Popular templates include Map Series, Tour, Journal, and Cascade.
Once you have selected the template you want to use, the rest is up to you. By clicking the “+” symbol you can choose to include text, media sources such as videos, a new title page, or immersive content such as a web map.
ESRI also designed Story Maps to link to outside content and various social media sites such as Flickr and Unsplash. ‘Link to Content’ is also extremely useful, as it allows users to add photos and videos found on the internet directly to their story map by copying and pasting their link.
To add interactive web maps into your story map users can link map layers from their ArcGIS Online account. Layers can be created in ArcGIS Online, but also in ArcMap where layers are exported as a zip file and imported onto your ArcGIS Online base map. Map layers can also be found online using the ‘add layer from the web’ or ‘search for layers’ options. The layers that appear are based on the type of ArcGIS Online account you have created. Enterprise accounts contain additional layers provided by your organization, however ESRI also has free downloadable layers available for users without an organization.
Users also have the option to make their story maps public by clicking the globe icon, or private for their own personal use by clicking the lock icon. To save your story map select the floppy disk icon. Your saved map will appear under ‘My Content’ in your ArcGIS Online account.
My Story and Creating Web Maps:
Over the last few years, theft in Toronto has been increasing at a rapid rate. According to the Toronto Police Service, Toronto experienced a total of 5430 thefts between 2014-2018. However, these are only those that have been reported and documented by police. In order to analyze the distribution of theft across the city, the Toronto Police created a point dataset that summarizes when and where each theft took place. Additional datasets were also created for prominent types of theft such as bicycle and auto theft.
To compare the number and types of thefts in each Toronto neighbourhood, I decided to create a story map using the Cascade template. This creates a scrolling narrative that allows viewers to observe the data in a clear, unique way. The reason I chose a story map was the number of layers I wanted to compare, as well as the ‘swipe tool’, which makes it easy to compare each neighbourhood. Therefore, I created a series of choropleth maps based on the 2014-2018 theft/crime data from the Toronto Police Open Data Portal.
The following steps were used to create each web map used in my Story Map:
Step 1: Download the point data and add the layer into ArcMap.
Step 2: Use the ‘spatial join’ analysis tool and select your neighbourhood boundary file as the target layer and the theft point data as the join feature. Make sure to select ‘join one to one’. This will produce a new layer with a ‘count’ field that counts the number of thefts in each neighbourhood – each neighbourhood is given a count.
Step 3: In order to produce accurate results, you must normalize your data. To do so add a new field into your attribute table (same layer with the count field) titled ‘Area’, and right click to select ‘calculate geometry’. Change the property to ‘area’ and choose the units you wish to use. Click ‘ok’ and the results will populate your new field.
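For readers working outside ArcMap, the count-and-normalize logic of Steps 2 and 3 can be sketched in R with the sf package; this is only an equivalent sketch, and the file and field names below are placeholders.

```r
# Equivalent of Steps 2-3 in R with sf (not the ArcMap tools themselves).
# File and field names are placeholders.
library(sf)

thefts         <- st_read("theft_points.shp")      # theft point data
neighbourhoods <- st_read("neighbourhoods.shp")    # neighbourhood boundary file

# Count theft points falling in each neighbourhood (the 'count' field from the spatial join)
neighbourhoods$count <- lengths(st_intersects(neighbourhoods, thefts))

# Area field for normalization (same role as Calculate Geometry), assuming a projected CRS
neighbourhoods$area_km2   <- as.numeric(st_area(neighbourhoods)) / 1e6
neighbourhoods$theft_rate <- neighbourhoods$count / neighbourhoods$area_km2
```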
Step 5: Export the layer and save it as a compressed zip folder. Import the data into ArcGIS Online by clicking the “Add” tab.
Step 6: Once you import your layer, you are given a variety of styles to choose from. Select the one you like best (ex: choropleth) as well as the field you wish to map – in this case select ‘count’. To normalize ‘count’, select the ‘divided by’ dropdown and choose your ‘Area’ field. Change the colour of your map to your preference by clicking ‘symbols’.
Step 7: Save your layer and select the tags that relate to your topic. The layer will now appear in ‘My Content’ where it can be added to your Story Map.
Step 8: To compare each layer add both layers you wish to compare to your story map by using the “+” symbol. Once you have done so, choose the transition type (ex: horizontal swipe) you want to use by clicking on the arrow below. The transition will take place as the user scrolls through your story map.
My Story Map titled “Toronto Theft: A Neighbourhood Investigation” can be viewed here: