Saturday, December 27, 2008

iPhone application development

I have recently been researching how we can implement our application on a mobile device. Since we are using an iPhone as a testbed, it has become an interesting problem to work out how to use OpenGL and Objective-C to port our program to this device.

I have found some great resources related to these problems, along with some sample code that should be of great use to us. Below are some links I am posting for reference when I actually implement the program on the mobile device.

/* This link is sample code for GLSprite. The GLSprite sample application shows how to create a texture from an image. By looking at the code, you can learn how to use Core Graphics to create a bitmap context and draw an image into the context. You can then see how to use OpenGL ES to create a texture from the image data.

This application is built on the Cocoa Touch OpenGL Application template. Instead of using GL_COLOR_ARRAY as provided in the template, GLSprite renders a texture. The textured sprite in the application rotates using the timer that's provided with the template. */

http://developer.apple.com/iphone/library/samplecode/GLSprite/index.html

/* XMLPerformance -- This sample explores two approaches to parsing XML, focusing on performance with respect to speed, memory footprint, and user experience. The XML data used is the current "Top 300" songs from the iTunes store. The data itself is not particularly important to the sample - it was chosen because of its simplicity, availability, and because the size (approximately 850KB) is sufficient to demonstrate the performance issues central to the sample.

This could be very useful if we can find a way to export the database in an XML format, which would be easy to transfer as well as less data-intensive and chatty. */

http://developer.apple.com/iphone/library/samplecode/XMLPerformance/index.html
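
Sticking with the theme of that sample, here is a rough Python sketch of the two parsing approaches it compares: a tree-style parse that loads the whole document at once versus an event-driven, incremental parse that keeps memory flat. The feed structure below is made up for illustration, not the actual iTunes schema.

```python
# Tree parse vs. incremental parse of a small song feed.
import xml.etree.ElementTree as ET
from io import BytesIO

SAMPLE = b"""<feed>
  <entry><title>Song A</title></entry>
  <entry><title>Song B</title></entry>
</feed>"""

def parse_tree(data):
    # DOM-style: the whole tree lives in memory at once.
    root = ET.fromstring(data)
    return [e.findtext("title") for e in root.iter("entry")]

def parse_incremental(data):
    # SAX-style: handle each element as it completes, then discard it.
    titles = []
    for _, elem in ET.iterparse(BytesIO(data), events=("end",)):
        if elem.tag == "entry":
            titles.append(elem.findtext("title"))
            elem.clear()  # free the finished subtree immediately
    return titles
```

On a memory-constrained device like the iPhone, the event-driven style is usually the one worth the extra code.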

/* LocateMe -- Shows how to use the CLLocationManager class to determine the user's current location. It demonstrates starting and stopping updates, error handling, and changing location parameters.

This can be used to determine a user's location, which will enable us to provide location-based content. It can also be used for tracking users, among other possible applications. */

http://developer.apple.com/iphone/library/samplecode/LocateMe/index.html

/*UICatalog -- This sample is a catalog of all the UI elements found in the iPhone's UIKit framework. It is designed to exhibit a large variety of views and controls along with all their varying properties and styles. If you need code to create specific UI controls or views, refer to this sample and it should give you a good head start in building your user interface. In most cases you can simply copy and paste the code snippets you need. */

http://developer.apple.com/iphone/library/samplecode/UICatalog/index.html

/*SeismicXML -- The SeismicXML sample application demonstrates how to use NSXMLParser to parse XML documents. When you launch the application it fetches and parses an RSS feed from the USGS that provides data on recent earthquakes around the world. It displays the location, date, and magnitude of each earthquake, along with a color-coded graphic that indicates the severity of the earthquake. */

http://developer.apple.com/iphone/library/samplecode/SeismicXML/index.html
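
As a rough Python sketch of the color-coding idea, here is a magnitude-to-severity mapping; the thresholds and colors are my own assumptions, not the values Apple's sample uses.

```python
# Map an earthquake magnitude to a display color (illustrative thresholds).
def severity_color(magnitude):
    if magnitude >= 7.0:
        return "red"      # major
    if magnitude >= 5.0:
        return "orange"   # moderate
    if magnitude >= 3.0:
        return "yellow"   # minor
    return "green"        # barely felt

quakes = [("Sumatra", 7.4), ("Nevada", 4.2), ("Alaska", 5.1)]
colored = [(place, mag, severity_color(mag)) for place, mag in quakes]
```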


There are also many other sample apps available that should prove very useful in developing the NIJ mobile device implementation. This other sample source code can be found at:

http://developer.apple.com/iphone/library/navigation/SampleCode.html


Thursday, December 11, 2008

NIJ Project Conceptual Architecture and Data Flow Models

This figure is the conceptual project architecture for the NIJ Project. The image was adapted from the original project proposal by Dr. Jiyeong Lee. It describes in detail all aspects of the project, how data will be input, and how the interfaces will be used.





The next image represents the flow of data from original building CAD file to the final mobile and desktop interfaces.





Mobile Application Implementation

Today I finished composing a document that details the functionality and goals of the mobile application for the NIJ project. I will include a few of the key details in this post.

Mobile Application Goal

Implement a mobile application to support the operation of a GIS-based intelligent emergency response system (GIERS) (Kwan and Lee, 2005) that uses a 3D geocoding technology for locations in large buildings. The application will aim to facilitate quick response to emergencies in multi-level structures using existing network technologies. The mobile application will be a key component of the overall project, furthering the gathering of real-time data to aid the decision-making process. This will allow emergency managers to respond to emergencies accurately, and to evaluate and implement emergency response plans accordingly.

Along with providing remote data-gathering capabilities, the mobile application will serve as a secondary means of communication between dispatchers and emergency responders. By using informative visualizations, we hope to provide a means to guide responders through complex urban structures to points of interest within a building. In combination with our desktop application, the mobile application will provide a comprehensive set of tools to reduce the time required to respond to emergencies in large and complex structures.

Functionality

• Device
  • Provide a UI that is fully integrated for mobile use
  • Use network connections such as Wi-Fi or 3G
  • Effectively visualize key data
  • Remain useful in an emergency situation
  • Use GPS or cell-tower triangulation to locate users

• Application
  • Allow connection to a PostgreSQL database or web interface
  • Allow editing and viewing of certain data
  • Send/receive data efficiently
  • Display real-time information from the data source
  • Allow users to easily identify/flag critical information (blocked hallways, stairways, room destinations)
  • Remain intuitive and easy to use
  • Allow multiple user types with different levels of access
  • Allow self-location of mobile users via GPS or triangulation
  • Allow reporting of different emergency types (fires, floods, chemical spills, etc.)
  • Provide interactive navigational guidance to critical locations
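
To make the reporting items above concrete, here is a hypothetical sketch of the payload a mobile emergency report might carry. The field names, emergency-type labels, and JSON format are assumptions made for illustration, not a settled schema.

```python
# Build a JSON report message for an emergency sighted by a mobile user.
import json
import time

EMERGENCY_TYPES = {"fire", "flood", "chemical_spill", "blocked_hallway"}

def build_report(user_id, emergency_type, building, floor, lat, lon):
    if emergency_type not in EMERGENCY_TYPES:
        raise ValueError("unknown emergency type: %s" % emergency_type)
    return json.dumps({
        "user": user_id,
        "type": emergency_type,
        "building": building,
        "floor": floor,
        "location": {"lat": lat, "lon": lon},  # from GPS/triangulation
        "timestamp": int(time.time()),
    })
```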

Sunday, December 7, 2008

Project Update

For the next week or so I will be working on figuring out the conceptual framework and goals of our mobile application. I have been reading through the research articles relevant to our study and exploring different ways to implement a solution.

As mentioned in a previous post, we will most likely be creating some kind of web interface that we will use to manipulate the data sets and allow users to edit things in the environment.

This week I will be composing a document on this, which will be circulated for modification by the PIs and other researchers. It will later be distributed to the researchers at Purdue and other relevant people in the hope that it generates new ideas for the mobile app implementation.

Wednesday, December 3, 2008

iPhone app implementation

I have been trying to find the best way to implement our application on the iPhone, and I have run into a few problems and a variety of possible solutions. The main problem of connecting to the database server can be solved by installing an app already written for the iPhone, Mobile Can 2.0. Another option is to install an SSH client to make the connection; several of those are available, including iSSH, pTerm, and TouchTerm, each sold in the App Store. One large problem with using these is that only one application can run at a time on the iPhone, unless special software is installed to allow otherwise, which requires "jailbreaking" the iPhone and modifying a critical config file. Even then, I would not be able to take advantage of this software, since there is no way for an external program to "listen" to the outputs of these apps.

Since this does not seem to be a very viable solution, I have been trying to find other methods for obtaining data remotely from our database. Another option I have discovered is a program called phpPgAdmin, which is a web-based PostgreSQL database administration tool. This would essentially make the database available for updates and modifications just by visiting a web address. It would be much easier to use on the iPhone, since no additional apps would need to be installed, and updates (for example, blocking a hallway) could be executed by constructing SQL queries and sending them through the iPhone's web interface to the database server. However, this would require that the database server also be a web server with PHP enabled, which means we would have to get Scott to install the software and enable Apache and PHP on CCID147.

The great thing about using the phpPgAdmin web interface is that any mobile device with internet access can make updates to the database, not just the iPhone. This means that any program we write for the iPhone can be ported to another mobile OS (e.g., Windows Mobile 6 or Android) relatively easily, without having to use proprietary iPhone software.
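
The "blocking a hallway" update mentioned above could be sketched as building a small SQL statement. The table and column names (hallway_lines, building, segment_id, blocked) are assumptions based on our line-file data model, not the real schema, and a real driver would bind parameters instead of formatting strings.

```python
# Construct an UPDATE statement that marks a hallway segment blocked.
def block_segment_sql(building, segment_id, blocked=True):
    # String formatting is used here only to show the shape of the
    # statement; real code should use bound parameters to avoid injection.
    return (
        "UPDATE hallway_lines SET blocked = %s "
        "WHERE building = '%s' AND segment_id = %d;"
        % ("TRUE" if blocked else "FALSE", building, segment_id)
    )
```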

If this is not possible there are other routes that we can take such as creating a REST API to send/receive data. This would also allow for interoperability across multiple platforms other than the iPhone.

I am awaiting an email on whether to proceed with the phpPgAdmin installation.


iPhone programming resources

I have been tasked with creating an extension of our program to be used on the iPhone. I have been looking around the web for tutorials and explanations on how to program using Cocoa Touch, Objective-C, and other tools. Here are some useful websites I have discovered on these topics:

CS193P - Stanford iPhone programming class lecture notes and other information
http://www.stanford.edu/class/cs193p/cgi-bin/index.php

iPhone development blog (third-party)
http://icodeblog.com/

Apple iPhone Dev Center
http://developer.apple.com/iphone/

Getting started video
http://www.wonderhowto.com/how-to/video/how-to-get-started-with-iphone-programming-in-cocoa-184437/


Friday, November 14, 2008

Updates to CAD_database

After I uploaded an erroneous table to the database, it was necessary to delete the whole thing and start over. Today I re-uploaded all of the files that were previously in the database, with a few new additions.

These new additions include the woodward_rooms files that I have created. These files are simply the RM$ layer of the CAD files and refer only to the room polygons. I have also put two files called Woodward_test* into the database. These will be used for testing the automation process with Jianfei's program. I will begin adding more of these with varying degrees of complexity to test how his program interprets them.

Thursday, November 13, 2008

New buildings coming soon

I have put in a request with Fred Brillante from Facilities Management to get more CAD data files. I have requested that we receive preferably the SAC, Cone Center, College of Health and Human Services, and the College of Education.

The main idea is to create an "urban cluster" similar to the type of data we will be working with in an actual urban environment. When this data is incorporated, we will most likely begin testing the automation portion of the project to see how effective our techniques are at setting up this type of data. We can then also begin testing various scenarios related to urban clusters and emergency management.

Tuesday, November 4, 2008

New database and files for automation process

I have created a new database entitled "CAD_database" which I will use to put CAD shapefiles into a PostGIS database. The purpose of this database is to hold all of the data necessary to automate the room number/name and type assignments.

Currently, I have the raw converted shapefiles in there, along with a generalized version of the Cameron building's polygon file. I have deleted all unnecessary polygons from the file (called cri_rooms*), so it contains only the room geometry.

I hope to use the original shapefile to further divide the polygons into several groups of files that will be used to relate room numbers to polygons in the table. There are several ways to achieve this; however, I must see what Jianfei wants to do in order to cater to his needs. Since each building, and sometimes each floor, has different labels and layers, this may be a hard process to accomplish.

Friday, October 31, 2008

Geo-referenced Raw CAD Shapefiles

Today I have been using the spatial adjustment tool in ArcGIS to geo-reference the CAD shapefiles I created. We have been moving toward using these in an effort to save time and set up the data so that automated creation of our other data can begin.

Currently, I have the CRI and Woodward Hall CAD shapefiles correctly geo-referenced, using the Mecklenburg County building footprint file as my building reference. This process takes only about 20 minutes per building and is much faster than the previous method of copying and pasting the CAD polygons directly into our geodatabase.

I have noticed some discrepancies in the types of CAD files from building to building. In the CRI files, the room numbers are actually contained in the polygon, point, and annotation layers of the CAD shapefiles. From what I have seen, the Woodward Hall CAD shapefiles do not have this type of information anywhere other than the annotations made on the original CAD files. This will definitely be a problem in the future, especially if we find that most buildings exclude this information from their CAD files.

I am now working on getting the Atkins library files geo-referenced and should be done with that by the end of the day.

Friday, October 24, 2008

Today's Progress 10/23

I finally got an email back from Kurt Olmsted about the data I had requested from his department. The Mecklenburg GIS team now has the 911 data and the master address data, and today I got access to their database of information.

This should prove useful, especially if the scope of the project increases in the future. I also have building footprint data for the entire county that I will be able to use as contour layers for buildings.

We have also been working on the automation problem. Jianfei and I are currently working to come up with a solution: I am strictly using GIS, while he is using analytical tools in a program that he is writing. I am hoping that we can come up with a solution to fix these time-consuming issues.

Thursday, October 23, 2008

Today's Progress 10/23

Today I have been separating the Atkins Library files into their specific classes. For each floor I now have a poly, stairway, and elevator file, each separated into its specific class.

For the time being I am not going to manually add room numbers. This is because we are coming up with a new method to assign this type of data that will hopefully make the process much easier and automated.

For next week I hope to incorporate this automation so that we can have a complete set of files for Atkins Library.

Wednesday, October 22, 2008

Today's Progress

Today I updated COITVISCENTER.uncc.edu, which is serving as our backup database, with all of the new shapefiles and geodatabase files. I need to start archiving things on a regular weekly basis now that we have much more data to store and handle. It would be a critical loss if anything happened to the local machine, which is where the majority of the source data is.

Another thing I did today was email Kurt Olmstead, who is the director of Mecklenburg County GIS services. I asked him for some help with this research project and whether he could provide any relevant data for us. I met him earlier this week when he gave a presentation to our class, and he let me know that his department now handles the E-911 and MAT data for the county.

I am still awaiting an email back, but hopefully something good will come of this.

New effects and changes to NIJ Analysis program

Jianfei has made several changes to the NIJ Analysis program. One addition is an indicator of a fire, displayed as animated smoke. Later on we hope to have thematic additions for chemical spills, floods, etc. A screenshot of this smoke effect is displayed below.


Another addition is an airplane that can be dragged around the map to see the contents of a specific building. This will allow you to see a detailed view of the building by simply putting the airplane's light beam on the specified building. Later we hope to add more effects, such as placemarks and possibly terrain. A screenshot of the airplane effect is displayed below:

Friday, October 17, 2008

Completed Atkins Library Files

I have finally geo-referenced and aligned all of the CAD data for the Atkins Library files. I now have everything converted over to geodatabase polygon features.

The next step will be to annotate and separate the polygons into their respective classes, and later create the hallway line and line junction files.

Now that this step is completed the rest of the conversion process should be much easier.

Here are some screenshots of the library displayed in ArcScene with 60% transparent layers:




Wednesday, October 8, 2008

Create New Postgres Users and Database Maintenance

Procedure for creating a new postgres user with proper attributes:

Use either:

CREATE ROLE name LOGIN;
or
CREATE USER name;

(CREATE USER is equivalent to CREATE ROLE except that CREATE USER assumes LOGIN by default)

Some Important Attribute and Privilege Changes for Users:

To allow database creation for a user:
ALTER ROLE name CREATEDB;

To set a password for a user:
ALTER ROLE name PASSWORD 'password';

To grant or revoke access on a table or database (ALL can be replaced with SELECT, UPDATE, DELETE, etc.):
GRANT/REVOKE ALL ON tablename TO user;

Database Maintenance and Backup Procedures:

Performing an SQL dump and restoring it (this generates SQL for all tables in the database and writes it to an outfile for backup):

pg_dump dbname > outfile
psql dbname < infile

or, for the whole cluster:

pg_dumpall > outfile
psql -f infile postgres

Performing VACUUM maintenance to recover disk space and update database statistics:
VACUUM ANALYZE;
or just VACUUM;

Reindexing:
REINDEX;


These tasks should be performed every week to make sure our system is backed up and running optimally.











Friday, September 19, 2008

Project Update 9/19/2008

This week we have been working on finishing the implementation of hallway segmentation and multi-desktop connections.

I have finished updating the Woodward and Cameron files with hallway polygons so that we can facilitate blocking/segmentation. This seems to be working well, and Jianfei has implemented these changes in his program. We now have the capability to choose polygons and block them off, as well as stairways, elevators, and entrances.

Multi-desktop connections work in theory; the only problem is that we have not been able to connect to PostGIS remotely yet. However, we have been able to make changes in two programs running on the local machine and see dynamic updates when a section of the building is blocked off. Next week we should be able to fix the remote-connection problem so that we can log in and change things from another machine.
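
The blocking behavior those dynamic updates enable can be sketched as a shortest-path search that simply skips blocked segments. The tiny graph, node names, segment IDs, and weights below are invented for illustration, not taken from our actual data.

```python
# Dijkstra over an indoor graph, ignoring edges whose segment is blocked.
import heapq

def shortest_path(edges, start, goal, blocked=frozenset()):
    # edges: {node: [(neighbor, weight, segment_id), ...]}
    dist = {start: 0.0}
    prev = {}
    heap = [(0.0, start)]
    while heap:
        d, node = heapq.heappop(heap)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nbr, w, seg in edges.get(node, []):
            if seg in blocked:
                continue  # this segment has been blocked off
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(heap, (nd, nbr))
    if goal not in dist:
        return None  # no unblocked route
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]

EDGES = {
    "entrance": [("hall_a", 1.0, 1)],
    "hall_a": [("room_210", 1.0, 2), ("hall_b", 1.0, 3)],
    "hall_b": [("room_210", 3.0, 4)],
}
```

Blocking segment 2 in this toy graph forces the detour through hall_b, which is exactly the kind of re-route we want the clients to see after a database update.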

Progress on digitizing the Atkins floorplans is very slow at the moment; currently I only have the polygon data converted from CAD. I have been experimenting with different spatial transformation algorithms to line the data up, but they keep coming out very skewed. I have since gone back to my old method of manually identifying a building footprint and aligning all other layers to this contour file. I hope that by next week I will have a final model of the building to put in the database, and in subsequent weeks I will begin the process of identifying polygons and annotating them for use in our program. The last step will be creating the hallway networks, which I am very unsure about because of the complexity of this building.

Here is a screenshot of my progress digitizing Cameron; this is the first floor:


Wednesday, September 10, 2008

Changes in Mid-level abstraction and segmentation. Change in line file data model...

We have decided to change the way we are handling the mid-level abstractions for the program. We have ditched the idea of trying to derive our own segments arbitrarily, and opted for a method that will hopefully allow well-defined segments with a better way to select/block segments of hallway.

What we have decided to do is incorporate hallway polygons into our display layers to define segments of the hallway. These segments have already been defined and are derived from the CAD files. Each of these polygon segments (see Figure 1) and its corresponding line features will be labeled with a "SEGMENT_ID" that determines which segment a hallway line belongs to.

These well-defined hallway polygons will also be used to allow the end user to select large "chunks" of hallway and block them off, or view more detail about a particular one. The problem we were facing before was defining hallway segments, and which corresponding line segments from the line files to include.

Due to this change in our abstraction method I have changed the Data Model for the line files that correspond to each building level.

The new model includes the following attributes:

OBJECTID
SEGMENT_ID
BUILDING
WEIGHT
flr_num
COMMENTS
SHAPE_LENGTH

Figure 1:


This screenshot details the hallway segmentation that we will be using, the selected area is a well-defined segment.

Friday, September 5, 2008

Project Update 9/5

This week we have been working on a method to facilitate the coarse-level abstraction we have been discussing. For now I am manually identifying different segments of the building that will be separate in Jianfei's coarse-level model. These segments of the building can be blocked off or displayed separately from their constituents. In the future we hope to make this segmentation feature automatic, but for the sake of time I am manually identifying and labeling segments for use in our demo. Progress on a demo is nearly complete, and we should have something to present by early next week. We are also striving to have communication to the database established between multiple clients in order to facilitate external changes to the DB.

I have also started converting the Atkins Library files over to GIS data, and the final product should be available within a couple of weeks. I am detailing the steps I go through to convert the data to what we need for this project, and I will provide a document describing the steps and their completion times. This document should provide a basis for the process that will be required if we move to automated conversion of this data.

Another avenue I am exploring is the use of Archibus data in place of CAD files. Our campus supposedly has Archibus files for each building, and judging from this article: http://www10.giscafe.com/nbc/articles/view_article.php?section=CorpNews&articleid=500611, this data can now be incorporated into ArcGIS very easily. I have emailed Fred about access to the Archibus system and the details of this kind of data; I am just awaiting a response.

Tuesday, September 2, 2008

New CAD files

I have finally received the new CAD files for the library. Throughout this week I am going to be starting the process of converting them to use in our project. I am also going to make a detailed description of how I convert the data so in the future we may move to an automated process for this.

Some other things going on are segmenting Woodward Hall and finishing up some scenarios/demos that we can show off.

Tuesday, August 26, 2008

Tuesday

Today I added the "segment" and "weight" columns to the line files for each building. I am still not sure what to do with the segment column, so I am just arbitrarily selecting zones and labeling them. I am not going to spend much time on this because I have more important things to do. This includes:

Adding and converting new buildings from their CAD files and exploring the possibility of using Archibus for data. I also would like to get a new version of the project web site up and running as soon as I possibly can (preferably by this weekend).

Thursday, August 21, 2008

Connected Indoor Graph to Outside Road Network

I have completed updating the line files for the indoor built environment so they now connect to the external road network. This was made possible by editing the existing line files so they connect to points in the road network junction file.

Jianfei is now running tests so we can begin setting up scenarios. We also now have the capability to search for a geocoded address and have a marker indicate the location of the typed-in address.

Here is a screenshot showing the newly completed model:

Wednesday, August 13, 2008

Scenarios and website progress

Over the past week I have been working on connecting the line files to the existing road networks. This will allow us to facilitate route-finding on a much larger scale and begin to develop a few scenarios that we can test. I have also started to redesign the website and add more content to make things a little more informative and navigable.

I would also like to start refining our map design and layout so that it's as informative as possible without having too much information. This should be finished in the coming weeks.

Thursday, July 31, 2008

New website location; finished updating Woodward junction files to 7-digit format

The website now sits at http://www.viscenter.uncc.edu/nij-gis/home.htm, which is where it will be hosted from now on. The link to this blog's feed is also available from the index page. I hope to start finalizing the design of the site by next week and have more content available.

I have also finished updating the junction file for Woodward Hall to reflect the new 7-digit identifier code that Jianfei and I came up with. He has also been able to load the Cameron database in his graph program with only a few errors. I am in the process of working these out and hope to have a final display by the end of the week.

Wednesday, July 30, 2008

All CRI files completed

I have finally finished uploading the last of the files for the CRI building to the PostGIS database. The junction files also contain the new annotation that we have developed, even though it's much more time-consuming and a bit redundant.

Some things that still need to be done are:

  • Finish up web page design, and secure space on the webserver for this page to be stored on
  • Eliminate errors from the datasets as we come across them
  • Change the Woodward junction file so that it has the new annotation that Jianfei's program will support
  • Get the Interoperability extension working so I can read/write directly to the PostGIS database

Sunday, July 27, 2008

Collaboration Ideas and Progress on CRI

Today I have been working on completing the entire Cameron dataset. The only things left to do are to create and annotate the junction files for the graph model and do some cleanup. I also need to add a few things to each file, such as floor number columns and the name of the building itself. There are also several errors (specifically in cameron_flr1_strwys) that need to be taken care of. I hope to finish this dataset and have it in PostGIS by Monday.

I would also like to be able to connect directly to the database using ArcCatalog so that I will not have to go through the trouble of SSHing into the server from a virtual machine and copying the files up from there.

Some other things I discovered today are very important, and I need to start emailing people for input. The main discoveries are two web sites: http://www.gita.org/ and http://www.opengeospatial.org . These sites have a ton of information about the standards used for CAD/GIS integration, indoor GIS, emergency management, and other resources that would be very useful for the project. There are also listservs I can start using to contact people with questions related to our project. Some other interesting things are available as well, such as standard data models for the type of data we are using and links to projects and people that will definitely be able to help now and in the future.

Friday, July 18, 2008

zigGIS

I just made a great discovery in an ArcGIS extension called zigGIS.

From their site:

zigGIS is an ArcGIS Desktop extension that allows you to connect directly to spatial data stored in PostGIS. It is a lightweight option to allow you to centralize your spatial data into the leading open-source spatially-enabled relational database.

Using zigGIS, you will be able to take advantage of the advanced analysis and cartographic tools of ArcMap while leveraging the superior spatial data storage and management capabilities of PostGIS. zigGIS will enable you to view, analyze and edit your PostGIS spatial data from within ArcMap.

The most exciting new feature of zigGIS is the introduction of multi-user editing of PostGIS data from within ArcMap. zigGIS now includes tools to enable you to check out your data and make edits with the native ArcMap tools.

Best of all, zigGIS enables all of this capability for users of ArcView on up without the need for additional middleware.

(via: http://www.obtusesoft.com/)

This is awesome news for me since I use PostGIS every day. The only gripe I have is no support for ESRI (at least not editing). This is going to make my life a hell of a lot easier and data management a breeze, because currently uploading spatial data to PostGIS is a painful process when going from a Windows box using ArcInfo to a PostGIS database on a Linux server.

More Progress

It seems that I have finally corrected the errors that were sending Jianfei's graph out of whack. These were mostly disconnection errors or labeling errors. We have been running tests using the path-finding algorithm, and everything seems to be working OK. There are still some errors occurring due to how the algorithm works and decides on using stairways and elevators. These problems should be easy to fix.

Right now I am working on incorporating different data sources into ArcScene to try to create a composite scene with the graph model, the Woodward files, and other GIS data layers that might be relevant.

Here are some screens of how the graph is looking and a couple looks at the incorporated GIS data in ArcScene:



This mainly shows the internal structure of the hallways and the paths to get around the building. This graph is what we will use for route-finding, and we hope it will eventually get incorporated into the street networks so we can do route-finding on a larger scale (hopefully citywide).






These two images show how different data sources can be incorporated into our current work. I have created 3D buildings of the campus and added in roads along with the Woodward and Cameron files. These files are obviously much more detailed; however, having this type of view is necessary not only to make the data easier to visualize, but also to make it easier for an emergency worker to find their way around.

Wednesday, June 25, 2008

Finished Standardization of Junction Datasets

It was requested that I update the junction file to add capabilities for buildings with 10 or more floors. My previous code would not work if the floor number was above 9, since I did not allow for an integer that large. This problem is now fixed, and the junction files have been uploaded to the newer database that Jianfei created, called nij_junctions.
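
Since this post doesn't spell out the layout of the 7-digit code, here is a hedged sketch assuming a building(2) + floor(2) + junction(3) split; that split is my assumption, but the two-digit floor field is the kind of change that fixes the floors-above-9 problem.

```python
# Build a fixed-width 7-digit junction identifier (assumed field layout).
def junction_id(building, floor, junction):
    if not (0 <= building < 100 and 0 <= floor < 100 and 0 <= junction < 1000):
        raise ValueError("field out of range")
    # Two digits for the floor is what allows floors 10 and above.
    return "%02d%02d%03d" % (building, floor, junction)
```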

The next things I will be working on are finding centerline extraction techniques and some more info about CAD and GIS integration.

Tuesday, June 17, 2008

Almost finished annotating junction file

I have almost completed annotating all of the junctions for the indoor graph model. The only floor I lack is floor 4. While messing around with the data today I saw several things that need to be fixed. One is anomalous polygons throughout some of the datasets (most notably woodward_flr3_poly): several polygons need to be labeled or deleted, because there are far too many records in the table that aren't needed.

I will probably wait until next week when I don't have anything to do and tackle this. The PostGIS database will also need to be re-updated once I have all of this data finished and compiled. I am hoping that by the end of this month I will have a "1.0" version of the datasets and database completed with little to no errors in the data.

Monday, June 16, 2008

New annotation for network junctions shapefiles

Jianfei brought to my attention that it is very hard to tell which junctions correspond to which features on the graph he is making.

So what I am going to do is go through and manually annotate the network junction file to indicate what the junction is. The labeling schema will be similar to this:

Elevators : E###
Stairways: S###
Exit/Entrance: M###
Room: R###(alpha)
Everything Else (null): N###

This should make it a bit easier to interpret when he goes in and creates the floor centerline graph. I also added support for Z-coordinates and a spatial index on these new shapefiles.

Thursday, June 12, 2008

Database issues

Today I have been battling some database issues that seemed to be related to the schema in the SQL files. Unless you specify the schema to be public, the tables will not show up when using the \d command to list them; psql's \d only lists tables on the current search_path, and our schema wasn't on it.

The way I fixed this was to go back through and redo all of the SQL files, specifying the schema as public. This seems to be working for the time being, and all of the tables show up when executing the \d command.
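As an alternative to hand-editing each file, a small helper could prepend a search_path statement so psql resolves the loaded tables. This is a hypothetical helper sketched for illustration, not what I actually ran:

```python
def force_public_schema(sql_text):
    """Prepend a search_path statement to a dump so the tables land where
    psql's \d will list them. Hypothetical fix-up; the actual repair was
    redoing the shp2pgsql output with the schema set to public."""
    header = "SET search_path TO public;\n"
    if sql_text.startswith(header):
        return sql_text  # already patched, don't double up
    return header + sql_text
```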

Jianfei showed me his viewer today and it looks great. Hopefully, now that we have everything working, we will start progressing much faster.

Wednesday, June 11, 2008

Database Update

Since we have just installed a new version of PostgreSQL and PostGIS with GEOS support, I decided to reload the tables into the database. I also wanted to add a GiST index to make queries a bit faster.

The first thing I did was drop all of the tables that were currently in the database. Then I took the newest shapefiles from my machine and converted them to SQL using the shp2pgsql utility. I then uploaded the new files into the nij_database under the schema name "nij". The tables are there now, and I have put in not only the Woodward files but also the Cameron files.
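The conversion step is essentially one shp2pgsql invocation per shapefile. Here is a sketch that just builds the command string; the SRID value is an assumption (substitute whatever projection the data is actually in), and -s is a real shp2pgsql flag:

```python
def shp2pgsql_cmd(shp, table, srid=2264, schema="nij"):
    """Build the shp2pgsql command line used to convert a shapefile to SQL.
    srid=2264 (NC State Plane, feet) is an ASSUMED default for illustration;
    the posts never state which projection the campus data uses."""
    return f"shp2pgsql -s {srid} {shp} {schema}.{table} > {table}.sql"
```

The resulting .sql file is then loaded with psql into the target database.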

Another thing I did was convert the junction files that I had created with the network datasets and put them in the database as well.

I am hoping that Jianfei will have a viewer ready by next week; this will give me a better idea of how things are working and whether I need to make any changes to the tables or their geometries.

Monday, June 9, 2008

New versions installed

Looks like the system admin installed the newest versions of GEOS and PostGIS on the Linux machine where the database is kept.

I haven't had a chance to test anything out yet, since I am working on backing up files on removable media.

Tomorrow morning I should be able to put everything through a good test to make sure it's working normally. Hopefully I will break something...

Friday, June 6, 2008

GEOS problems

Seems like we have limited functionality in PostGIS since we haven't yet installed GEOS support. This will add functions that will allow us to extract geometry information from the tables we created.

Currently we are waiting for the system admin to install this software and set up the other packages that depend on it. I have also put in a request to have Quantum GIS installed on the Linux box so that we will have a way to view the GIS data we created.

Next week I am hoping we will have a prototype viewer and all of the datasets uploaded with all of the GEOS functionality that we need.

Thursday, June 5, 2008

Finalizing woodward database

I have sent the SQL files to Jianfei to upload to the database. These should be the final builds of this data, and hopefully it will work with little to no errors.

I have also been able to test the Quantum GIS tool to view the tables graphically using only the table information. This confirms that the "the_geom" column does contain all the needed information for interpreting geographic information.

The next step will be to keep sending datasets to the database and hopefully develop an entire database of two buildings, as well as a rudimentary viewer to look at these files.

Tuesday, June 3, 2008

PostGIS information

Today I discovered several good websites for PostGIS information. The first lists all of the functions that are included in PostGIS and how to execute them: http://www.bostongis.org/postgis_quickguide.bqg

The next thing I found is a viewer that reads directly from PostGIS; this utility is called Quantum GIS, and it is open source and cross-platform. The website is www.qgis.org

I am currently working with Jianfei to start construction of our own viewer and also uploading more datasets to the main database.

Wednesday, May 28, 2008

Database development in PostgreSQL

Database development seems to be coming along nicely now that I have a good idea of how PostGIS and PostgreSQL work.

In order to load a shapefile into the database, I simply use the shp2pgsql utility to convert the file. This also adds a geometry column called 'the_geom' to the table, containing a rather long hexadecimal string. To read this, you can issue SQL queries that extract geometry information such as vertices, areas, and lengths. This is how Jianfei will read in the information he needs to create an indoor viewer for the building.
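To illustrate what is actually sitting in that hexadecimal string: it is well-known binary (WKB) geometry. A minimal decoder for a 2D point, assuming plain WKB with no embedded SRID (PostGIS really stores EWKB, which may include one, so this is a simplified sketch):

```python
import struct

def decode_wkb_point(hexstr):
    """Decode a 2D WKB point from the hex string stored in the_geom.
    Sketch only: assumes no SRID is embedded (PostGIS EWKB may have one)."""
    raw = bytes.fromhex(hexstr)
    byte_order = raw[0]                       # 1 = little-endian, 0 = big-endian
    fmt = "<" if byte_order == 1 else ">"
    (geom_type,) = struct.unpack_from(fmt + "I", raw, 1)
    assert geom_type == 1, "not a WKB point"  # 1 = Point in the WKB spec
    x, y = struct.unpack_from(fmt + "dd", raw, 5)
    return (x, y)
```

In practice you would let PostGIS do this with its SQL accessor functions rather than parsing the hex yourself; this just shows the column isn't opaque.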

Hopefully by next week I will have the kinks worked out on the GIS database end, and Jianfei will have a program that can read in these variables through the libpqxx libraries for C++.

I will be going out of town for the rest of the week and return on Tuesday to start finishing this stuff up.

Friday, May 23, 2008

Uploading shapefiles to PostGIS

I will be able to upload the shapefiles of the datasets I created as soon as Scott can fix my Linux account. It seems that when I was gone for a few weeks without logging in, my account got reset or something. So I am basically sitting here with a bunch of data that I can't do anything with. As soon as I can get this data uploaded, I can start using the API and bang out some code.

Thursday, May 22, 2008

PostGIS info

I am currently trying to implement PostGIS databases for use in our project. So far I have installed the software needed to run the open source database, and I am currently reading through the documentation trying to get an idea of how it works.

Some interesting things that I would like to have are:

GiST Indexes

GiST stands for "Generalized Search Tree" and is a generic form of indexing. In addition to GIS indexing, GiST is used to speed up searches on all kinds of irregular data structures (integer arrays, spectral data, etc.) which are not amenable to normal B-Tree indexing.

Once a GIS data table exceeds a few thousand rows, you will want to build an index to speed up spatial searches of the data (unless all your searches are based on attributes, in which case you'll want to build a normal index on the attribute fields). The syntax for building a GiST index on a "geometry" column is as follows:

  CREATE INDEX [indexname] ON [tablename] USING GIST ( [geometryfield] );

Building a spatial index is a computationally intensive exercise: on tables of around 1 million rows, on a 300MHz Solaris machine, we have found building a GiST index takes about 1 hour. After building an index, it is important to force PostgreSQL to collect table statistics, which are used to optimize query plans:

  VACUUM ANALYZE [table_name] [column_name];
  -- This is only needed for PostgreSQL 7.4 installations and below
  SELECT UPDATE_GEOMETRY_STATS([table_name], [column_name]);

GiST indexes have two advantages over R-Tree indexes in PostgreSQL. Firstly, GiST indexes are "null safe", meaning they can index columns which include null values. Secondly, GiST indexes support the concept of "lossiness", which is important when dealing with GIS objects larger than the PostgreSQL 8K page size. Lossiness allows PostgreSQL to store only the "important" part of an object in an index -- in the case of GIS objects, just the bounding box. GIS objects larger than 8K will cause R-Tree indexes to fail in the process of being built.

Using Indexes

Ordinarily, indexes invisibly speed up data access: once the index is built, the query planner transparently decides when to use index information to speed up a query.

Wednesday, May 14, 2008

More Progress

I have finished cleaning up all the datasets for CRI and they are now ready to be exported to the database as soon as Kirk comes in.

The only thing left to be done is to finish cleaning up Woodward floors 3 and 4 and I will have all of the datasets completed, as far as rooms, stairways, and elevators go.

The next thing I will be working on is creating floor networks for CRI. After that is completed, I am going to start working on understanding the ArcObjects model and start building things from that.

Thursday, May 8, 2008

Things still to be done:

  • clean up CRI floors 1, 2, 3
  • clean up Woodward floors 1, 2, 3, 4

  • export datasets to shapefiles for Kirk
  • upload shapefiles to a database using PostGIS and PostgreSQL

Data Cleanup

Today I focused mainly on trying to clean up the data that I have been creating.

There are several errors and anomalies in the datasets that I created. These include rogue polygons copied over from the CAD files that are of no use to us, as well as other irrelevant features. Another thing that needs fixing: complete, correctly sorted database entries for each and every polygon. This process will involve making sure room numbers and other relevant information are present in the datasets for both Woodward Hall and Cameron Research Center.

Once this process is complete I will feel comfortable exporting this data to a more permanent database using PostGIS.

Monday, April 28, 2008

Screenshots

Here are some screen shots relevant to the project, some I have posted before but this will put everything in one place:


Sunday, April 27, 2008

Finished Network Datasets

I have finally completed the process of annotating the network datasets with room numbers. I have sent this information to Jianfei and hopefully he will be able to start graph creation on his end.

The next thing that I am going to start working on is connecting these graphs to existing road networks and then when I am finished with that I will begin the process of making network datasets for Cameron Research Center.

Wednesday, April 23, 2008

Codes to remember

In order to store attribute data for the network dataset nodes, I have decided to use codes in the database to differentiate each point from another. These are the codes I am using for reference:

5555 - stairway
4444 - building exit
3333 - hallway node
2222 - elevator
1111 - null

If the 'ID' attribute of the floor junctions shapefile is a three-digit number, then it is assumed to be a room number, not a code. I am hoping that using this will allow Jianfei to differentiate things a little more easily in his program, and give him a way to connect and represent important connection points.
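The decoding rule above can be stated as code, so the convention is unambiguous (the code table is taken directly from this post; treating any three-digit value as a room is the stated assumption):

```python
# Codes from this post's table.
CODE = {
    5555: "stairway",
    4444: "building exit",
    3333: "hallway node",
    2222: "elevator",
    1111: "null",
}

def classify(id_value):
    """Interpret the 'ID' attribute of a floor junction:
    a three-digit number is a room number, anything else is a node code."""
    if 100 <= id_value <= 999:
        return ("room", id_value)
    return ("node", CODE.get(id_value, "unknown"))
```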

Here is a screengrab of the 3d network dataset created using ArcScene, I was going to post it yesterday but Blogger was acting up:

Tuesday, April 22, 2008

Information about PostgreSQL and PostGIS

PostgreSQL
PostgreSQL is a powerful, open source relational database system. It has more than 15 years of active development and a proven architecture that has earned it a strong reputation for reliability, data integrity, and correctness. It runs on all major operating systems, including Linux, UNIX (AIX, BSD, HP-UX, SGI IRIX, Mac OS X, Solaris, Tru64), and Windows. It is fully ACID compliant, has full support for foreign keys, joins, views, triggers, and stored procedures (in multiple languages). It includes most SQL92 and SQL99 data types, including INTEGER, NUMERIC, BOOLEAN, CHAR, VARCHAR, DATE, INTERVAL, and TIMESTAMP. It also supports storage of binary large objects, including pictures, sounds, or video. It has native programming interfaces for C/C++, Java, .Net, Perl, Python, Ruby, Tcl, ODBC, among others, and exceptional documentation.


PostGIS
PostGIS adds support for geographic objects to the PostgreSQL object-relational database. In effect, PostGIS "spatially enables" the PostgreSQL server, allowing it to be used as a backend spatial database for geographic information systems (GIS), much like ESRI's SDE or Oracle's Spatial extension. PostGIS follows the OpenGIS "Simple Features Specification for SQL" and has been certified as compliant with the "Types and Functions" profile.

Conclusions
While I have not yet been able to test out this database technology, it seems very promising for our application. I will hopefully have a working installation of this open source technology soon, with a test database set up in it. Then I should be able to evaluate the performance of the system.

Network Dataset Progress and Website completion

Network Dataset Progress
Today I have focused mainly on updating the network datasets to include attributes and to have nodes for each room. This process was rather time consuming, since I had to refer to a CAD drawing to determine room numbers and then manually enter the data into the database. I have so far finished two floors and hope to have all four finished by the end of this week.

Project Website Progress
I have finally finished the initial design for the project website. So far I have all of the pages set up with some content in them and a link to this blog's RSS feed. The only thing that needs to be done is some simple formatting, maybe some graphics, and finally the remainder of content that is available. I would provide a link, but as of now the site hasn't gone public yet.

Wednesday, April 9, 2008

Progress on digitizing Cameron Research Center

While I am still unable to fully convert the dBase files into SQL scripts automatically, I have decided to start expanding on the data I already have.

Since Cameron Research Center is much smaller, less complex, and has fewer rooms, I have decided to start digitizing the building elements like I did for Woodward Hall. I was able to create feature classes for each floor's rooms (polygons) and still need to digitize hallways and elevators.

I will then take the much simpler table from this building and begin exporting it into an Oracle database. I also need to start looking for ways to automate these processes.

Tuesday, April 8, 2008

dBase to SQL Issues

In order to export the geodatabase, I must first convert the files to dBase and read them in a third-party program. Once I have the registered version, I will be able to convert directly to SQL scripts or .CSV files.

The one problem I am facing now is that the ObjectID field, which is a primary key, does not convert over to dBase format. It's of type "objectid", which, when displayed in DBF Manager, only shows up as zeros. I will have to find a way to assign object IDs in a different type so they can be transferred over.

Sunday, April 6, 2008

Future Goals, Database Info

Next week I plan on starting the geocoding aspect of the project. The first thing I will do is consult with Dr. Paul Smith about possible methods. I will then add entries to the MAT in order to put points on the map where I need them. I will most likely have to do this manually. After this is done, I will have to explore more options for implementing the data in 3D.

I also want to export the database so that SQL calls can be made programmatically.

If I can get all of these things done I will have a good week....

Monday, March 31, 2008

Combining 2D and 3D data through Geodatabase

I stumbled across an interesting powerpoint presentation from the UC ESRI Conference in 2007. It's about a hybrid data model for fire evacuation and is very similar to the project I am currently working on. It was proposed by Inhye Park from the University of Seoul, Korea.

The "hybrid" model proposes that a 2D environment be maintained in GIS and a 3D environment created elsewhere, all linked together through the geodatabase using SQL calls.

Here is an example from the presentation:

The first part, converting the CAD files to geodatabase files, I have already completed. The database has been built, but we have yet to link it to any other 3D data models. This is what we hope to accomplish in the coming weeks.

Below is a graphic depicting how the model will work once completed:

Wednesday, March 26, 2008

Interfacing with existing Mecklenburg Master Address Locator

I have obtained the master address locator for Mecklenburg County. This data is stored in a shapefile as point features with a large amount of database info. Each point on the map below represents an address that has been geocoded; the table beside it shows the attribute data for that specific address:



I will use this existing data to create the references to indoor locations. I believe that by giving each room a point in the address locator, it will be possible to do queries and other such data manipulation with the indoor data. Since I already have the point locations for Woodward Hall, it will be as simple as copying and pasting the points into the Master Address table and updating the attribute information.

Since there is no "standard practice" for indoor geocoding, I will have to develop a model that enables me to do queries and location analysis the same way as with regular addresses.

Once this is finished I should be able to facilitate route finding indoors by not only selecting points in space, but also by running queries on the location name or room number itself.

Tuesday, March 25, 2008

Goals for this week...

Today I have mainly been collecting datasets and other things so that I can continue making improvements to my prototype. The first thing I have now is a coverage with the latest street centerlines. However, it seems to not be entirely complete for UNCC, so I will later have to change it to a shapefile so I can add roads to it. It will also be necessary to build a network from the shapefile in order to do analysis on it.

I also need to start the process of creating the hallway networks in GIS. For now I am only going to focus on just the centerline shapefiles and creating a network out of them, then connecting them to the existing street centerline files.

If time allows, I also need to clean up the work I did previously and correct some alignment issues that I have been having.

I hope to start following a new data model that I have stumbled upon on the web. It's referred to as the Building Information Spatial Data Model (BISDM) and relates to indoor GIS applications. I am sure that this will come in handy in the future, the link is http://bisdm.org/information/

Tuesday, March 18, 2008

Server backups, shapefile conversions, and progress on network creation

Now that I finally have an account on the Vis Center server, I have backed up all of my database files and other pertinent information there. I will be periodically updating a copy of the database each week.

I have also taken all of the feature classes and converted them into shapefiles so that Jianfei can use them for his program. These are stored on the server and on the local machine under the directory 'shapefiles'.

I have begun to create networks for each floor. I have finished the first floor and everything seems to be working well. I have been able to use the find-route commands and map routes to rooms from the outside doors. The next step will be to create this data for each floor and somehow have them link up. This may be possible by inputting elevation data into the network.

Here are some screenshots of what the network dataset looks like along with the floor one datasets:


And here is a screen of the network by itself, the dark line indicates the shortest route between two points:

It's important to note that in order to maintain connectivity, you have to have connections at endpoints; connections along the middle of continuous edges will NOT work! I tried this earlier by drawing whole line segments for hallways, but it did not work, since each connection point has to be an endpoint for connectivity to work correctly. Without this, route finding will be erroneous or will not work at all.
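One way to guarantee the endpoint rule is to split each hallway polyline at every junction before building the network. A sketch of that preprocessing step (simplifying assumption: junctions fall exactly on existing vertices of the polyline):

```python
def split_at_junctions(edge, junctions):
    """Split a polyline (list of (x, y) vertices) into pieces so that every
    junction vertex becomes an endpoint of some piece. Sketch only: assumes
    junctions coincide with existing vertices rather than lying mid-segment."""
    pieces, current = [], [edge[0]]
    for pt in edge[1:]:
        current.append(pt)
        if pt in junctions and pt != edge[-1]:
            pieces.append(current)   # close the piece at the junction...
            current = [pt]           # ...and start the next piece there
    pieces.append(current)
    return pieces
```

After splitting, every junction is an endpoint of the pieces that meet there, which is exactly the condition the network dataset needs for connectivity.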

Friday, March 14, 2008

All feature classes are now finished

I have just finished creating all of the feature classes that I set out to do earlier. Everything has come together quite well, with the exception of a few alignment issues in the 3D view. I have all 4 floors correctly represented with feature classes for hallways, stairways, elevators, and rooms. Here is what the final result looks like:


After removing the rooms layers a nice view of the transportation structure of the building is provided. Here is a screenshot of how that looks:

Now that I have this completed, the next step will be creating a geometric network that represents the same thing with line features. Then we will be able to do analysis and route finding.

Monday, March 10, 2008

New feature classes for hallways, stairways, and elevators

Today I have updated the database to include 3 new feature classes: Hallways, Stairways, and Elevators. I have annotated them correctly and only need to finish floors 3, 4, and 5.

I am currently waiting for Joel to come up and install ArcInfo, then I will be able to expedite the process of creating and editing the feature classes.

Thursday, February 28, 2008

3-D data representation

Today I have been working on representing and displaying my data in 3-dimensions. I wanted to do this just as a proof of concept that the 3-D generation will work properly with my datasets.

I only have 2 floors at the moment, but the 3-D display in ArcScene is stunning.

Here are some screens I took today:

Floor 1 in 3D:



Floor 2 in 3D:




Composite:



Composite with orthophoto reference:

Wednesday, February 27, 2008

Dataset progress, solution for storage

I have finished the second floor polygon feature class and have most of the rooms generally annotated. I am happy to say that everything is geocoded and fits perfectly where it's supposed to be.

Right now I'm waiting on ITS to install ArcInfo so I can start creating and maintaining my geometric networks. In the meantime I am going to be finishing up the rest of the datasets and annotating them properly.

My biggest concern at the moment is that I have no place to store the geodatabase other than my local machine. This is a problem: since this is a shared computer, who knows what might happen to my work. I also need to get my datasets out so that my other colleagues can work on and edit them. Since it's not as simple as just emailing a dataset, I need some way to store the geodatabase on a server and allow FTP access. This is something I hope to resolve by Friday's meeting. I would rather not have my data on a local machine over Spring Break.

In conclusion, here is a great shot of what the completed polygon feature classes look like overlaid on the orthos, with roads as another layer:

Tuesday, February 26, 2008

Progress on room polygon feature class and georeferencing

Today I focused mainly on georeferencing all of the files I have into the NAD 1983 coordinate system. I have completed a good deal of this work, and am now focusing on extracting room polygons from the CAD data and pasting them into their own feature class in the geodatabase.

The room polygons have been extracted from the CAD files, and this is what it looks like:




I have also updated the database to reflect the room numbers and brief descriptions of each room. The process was rather painstaking and involved using the predefined polygons from the CAD files and copying them over into a new feature class in my database. I then used the CAD files to determine room numbers and other information. The areas of the rooms match the polygons in my feature class nearly perfectly, with only a few hundredths of a degree of error, which is to be expected.

The next step is to create the line features and convert them into a geometric network. This will allow me to focus on the big picture.

Thursday, February 14, 2008

Geometric Networks and Arc C++ SDK

Geometric Networks

Dr. Lee has an interesting paper about the 3D data model for representing relationships of urban features, it can be found here: http://gis.esri.com/library/userconf/proc01/professional/papers/pap565/p565.htm


Another interesting paper about Transportation networks in ArcGIS can be found here: http://gis.esri.com/library/userconf/proc02/pap0437/p0437.htm

It seems that ArcGIS has the functionality to create topologically related geometric networks, but only on the 2D scale. There are also network analysis tools integrated into ArcGIS that can provide such functions as "find path" that will allow us to minimize the amount of code that we have to write to get our system working.


Implementation can be done either by creating a new geometric network or making one from an existing file. For our application, it will probably be done by starting from scratch.

Here is a screen shot detailing the functionality of the Analyst:




ArcGIS SDK

I have installed the SDK on the computer I have been using in the VIS lab; it has an Object Browser, a GUID generator, and library search tools. This should prove useful when interfacing with C++ or any other programming language.

Wednesday, February 13, 2008

ArcGIS SDK, ESRI dev network, and database files

ArcGIS Desktop SDK for Visual C++

I just acquired an education version of ArcGIS 9 from the Geography department, and it has the C++ SDK on it. I haven't yet had a chance to play around with it, but it should prove very useful for the project.

I have also signed up to the ESRI developer network, which has a code exchange, scripts, and other useful information. This service will no doubt prove useful in the future. The weblink is: http://edn.esri.com/

I have also located a useful article that explains in detail how to compile and run applications written in C++ using the SDK, that link is located here: http://edn.esri.com/index.cfm?fa=codeExch.howToCSamples

Progress on Shapefile annotations, database files, and georeferencing...

The shapefile format actually consists of three files, one of which is a dBase file that can be read by MS Access. This file will be used to link the spatial info to the right attributes. Currently, the dBase file is quite cluttered since the CAD files were so detailed (there are point features for individual seats in rooms); the next step is to clear out all of the unwanted information and expand on the information that is needed. Many of the features need to be discarded for our purposes, and once this is done it should be easier to work with the files.

I have made some advancements on georeferencing the CAD files to their appropriate place in an orthophoto. By choosing control points on a georeferenced map, you can link them to other control points you have defined on a CAD file, shapefile, or image. They are then transformed to the new coordinate system and line up with the control points. The initial test of this didn't seem to work well, as I was not able to get the CAD file to "drape" over the appropriate area. It did indicate that it was matching the coordinates I had defined earlier, however. My goal here is to eventually have a model that displays the CAD file or shapefile draped directly over where it's supposed to be in the orthophoto. I could later use this referencing to create a 3D model of the built environment.
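For intuition, the two-control-point case of georeferencing reduces to a similarity transform (uniform scale, rotation, translation). ArcGIS fits affine or higher-order transforms from more points via least squares, but the two-point case can be sketched with complex arithmetic:

```python
def similarity_from_controls(src1, dst1, src2, dst2):
    """Derive the scale/rotate/translate transform implied by two
    control-point pairs, using complex numbers: T(z) = a*z + b.
    A sketch of the idea only -- real georeferencing uses more points
    and a least-squares affine fit."""
    s1, s2 = complex(*src1), complex(*src2)
    d1, d2 = complex(*dst1), complex(*dst2)
    a = (d2 - d1) / (s2 - s1)   # encodes scale and rotation
    b = d1 - a * s1             # encodes translation

    def transform(pt):
        w = a * complex(*pt) + b
        return (w.real, w.imag)

    return transform
```

Any point in CAD coordinates can then be mapped into the orthophoto's coordinate system with the returned function.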


Editing Geometric Networks...

This is a tool that is used to represent connectivity in utilities systems, and may have some use for our purposes. By defining connectivity rules, you can go through and create a geometric network that represent connectivity relationships spatially. There is a possibility that we can use this in conjunction with a route finding algorithm to visually display routes in ArcGIS.

Tuesday, February 12, 2008

Info on Shapefile and progress on conversion and developing a working model.....

Progress

Today I finished converting the .dwg files over to .shp files for our use. I have successfully created shapefiles for all floors of Woodward Hall and the other building (I think it's CRI). I have discovered a way to create all shapefiles from the .dwg documents in ArcGIS. I have all of these files stored on my H: drive currently, and hope to have them stored somewhere else shortly.

Before, I was only able to convert the polyline and polygon features, but now I have the ability to create shapefiles that represent annotation, multipatch, and point features.

Next Steps...

The logical next step is to start annotating the features in the shapefiles I created. This will involve specifying room numbers, identifying which walls are outside/inside, and determining hallways and other information.

Shapefiles


I have been reading into information about the shapefile format and stumbled across a helpful pdf file put together by ESRI:


http://shapelib.maptools.org/dl/shapefile.pdf

It contains some useful information about the format of shapefiles and specific information about what is in header files, record contents, and how features (polygons, lines, areas) are represented in the file. This will be useful if interacting with the shapefile using a programming language.

Shapefile C Library

Jianfei pointed this site out to me earlier this week. After further examination it seems to be a great source for a shapefile API. The link is: http://shapelib.maptools.org/

Final Thoughts...

It will soon be necessary to have a central location for storing the files I have been creating. My H: drive is simply too small to contain all of the shapefiles that we are going to need for our models.

I have also contacted Paul Smith about concerns I have about the project and other GIS questions. I have asked that he come to one of the meetings on Friday to discuss information he might have that will be helpful in our project.