Friday, March 13, 2009

CoreLocation Integration in the mobile application

Today I have been reading through the documentation for the CoreLocation framework in iPhone OS. This framework allows you to determine the user's location through the CLLocationManager interface, which lets you configure the desired accuracy and a distance filter; each reading then reports coordinates, altitude, and their error estimates.
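
To give a sense of how this fits together, here is a minimal sketch of configuring a CLLocationManager (the LocationTester class is a placeholder of my own, not code from the app):

#import <Foundation/Foundation.h>
#import <CoreLocation/CoreLocation.h>

// Illustrative delegate class, not the app's actual one.
@interface LocationTester : NSObject <CLLocationManagerDelegate> {
    CLLocationManager *locationManager;
}
- (void)startUpdates;
@end

@implementation LocationTester

- (void)startUpdates {
    locationManager = [[CLLocationManager alloc] init];
    locationManager.delegate = self;

    // Request the best accuracy the hardware can provide, and only
    // deliver a new reading after the device moves at least 10 meters.
    locationManager.desiredAccuracy = kCLLocationAccuracyBest;
    locationManager.distanceFilter = 10.0;

    [locationManager startUpdatingLocation];
}

@end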

This feature should be a useful addition to the mobile application if we can incorporate it into the program effectively. I think that providing an interface where users can see their own location will make the application much more usable.

For the time being, I have written a simple program that tests out the CoreLocation framework by pulling the latitude, longitude, and altitude, along with their error values. Below is a screenshot of that application.

[Screenshot: the CoreLocation test application displaying latitude, longitude, altitude, and their error values]

This is another example of how the iPhone Simulator doesn't let you test every aspect of an application: it does not supply real location readings. As soon as I am able to test this on an actual device, I should get correct results that I can check for errors.
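
The readings themselves arrive through the CLLocationManagerDelegate callbacks. Here is a minimal sketch of how a test program can pull the values out (the log messages are illustrative, not the app's actual output):

// Called by CLLocationManager each time a new reading arrives.
- (void)locationManager:(CLLocationManager *)manager
    didUpdateToLocation:(CLLocation *)newLocation
           fromLocation:(CLLocation *)oldLocation {
    NSLog(@"Latitude:  %f (+/- %f m)",
          newLocation.coordinate.latitude, newLocation.horizontalAccuracy);
    NSLog(@"Longitude: %f (+/- %f m)",
          newLocation.coordinate.longitude, newLocation.horizontalAccuracy);
    NSLog(@"Altitude:  %f m (+/- %f m)",
          newLocation.altitude, newLocation.verticalAccuracy);
}

// Called when a reading could not be obtained.
- (void)locationManager:(CLLocationManager *)manager
       didFailWithError:(NSError *)error {
    NSLog(@"Location error: %@", [error localizedDescription]);
}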

I have also been going through more of the SDK documentation to see what else I can add to make my program more usable. I hope to implement some more features that use the iPhone's accelerometer, web browser, and possibly other capabilities.

I will be returning to Charlotte tomorrow, which should let me talk with Jianfei about implementing our code to read building geometry. I will post progress on this next week.

Wednesday, March 11, 2009

iPhone application progress

Today I have been working on the latest implementation of the NIJ Mobile app. Currently the app can display buildings as images (this will be changed later to read polygon data from a building vertex file) and query the database, and it has implementations of some of the iPhone's features such as CoreLocation, multitouch, and the accelerometer.

Today I have been setting up the OpenGL views and buffers that will allow me to draw with OpenGL ES on the iPhone. I currently have one building being displayed, and the user can rotate, zoom, and move the view around. Since the iPhone Simulator that came with the SDK does not let me test multitouch input, I cannot test the zoom and rotate controls at the moment.
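
For a sense of what that setup involves, here is a minimal sketch of an OpenGL ES 1.1 view (MapGLView is a placeholder of my own, not the app's actual class):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>
#import <OpenGLES/EAGL.h>
#import <OpenGLES/EAGLDrawable.h>
#import <OpenGLES/ES1/gl.h>
#import <OpenGLES/ES1/glext.h>

@interface MapGLView : UIView {
    EAGLContext *context;
    GLuint framebuffer;
    GLuint renderbuffer;
}
@end

@implementation MapGLView

// Back this view with a CAEAGLLayer so OpenGL ES can render into it.
+ (Class)layerClass {
    return [CAEAGLLayer class];
}

- (id)initWithFrame:(CGRect)frame {
    if ((self = [super initWithFrame:frame])) {
        CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
        eaglLayer.opaque = YES;

        // Create and activate an OpenGL ES 1.1 context.
        context = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES1];
        [EAGLContext setCurrentContext:context];

        // Attach a renderbuffer that shares its storage with the layer,
        // so whatever we draw shows up on screen.
        glGenFramebuffersOES(1, &framebuffer);
        glGenRenderbuffersOES(1, &renderbuffer);
        glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, renderbuffer);
        [context renderbufferStorage:GL_RENDERBUFFER_OES
                        fromDrawable:eaglLayer];
        glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES,
                                     GL_COLOR_ATTACHMENT0_OES,
                                     GL_RENDERBUFFER_OES, renderbuffer);
    }
    return self;
}

@end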

In the next few days I hope to add more functionality, most notably the ability to read the building vertex files that Jianfei has been reading in his program. He already has the code for this, and I hope that a successful conversion from his C++ code to my Objective-C iPhone app will be possible.
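
One thing in our favor: Objective-C++ lets a file with the .mm extension mix C++ and Objective-C directly, so his loader may not need a full rewrite. A rough sketch under assumed conditions (the Vertex struct, the loadBuildingVertices function, and the plain-text "x y z per line" file format are my guesses, not his actual code):

// BuildingLoader.mm -- compiled as Objective-C++, so it can call C++.
#import <Foundation/Foundation.h>
#include <vector>
#include <cstdio>

// Assumed vertex record; the real format may differ.
struct Vertex {
    float x, y, z;
};

// Hypothetical loader for a plain-text file of "x y z" lines.
static std::vector<Vertex> loadBuildingVertices(NSString *path) {
    std::vector<Vertex> vertices;
    FILE *file = fopen([path fileSystemRepresentation], "r");
    if (!file) return vertices;

    Vertex v;
    while (fscanf(file, "%f %f %f", &v.x, &v.y, &v.z) == 3) {
        vertices.push_back(v);
    }
    fclose(file);
    return vertices;
}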

Below is a screenshot of the main Map view for the application:

[Screenshot: the main Map view showing a single building, with the function buttons on a toolbar]

This image shows the main map view, which provides the functionality described above. I have also implemented some simple function buttons on a toolbar: one to jump to a query, one to return to the main view, one to refresh the current view, and one to go back to the previous screen.
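
Wiring up that kind of toolbar is straightforward. A minimal sketch (the titles and action selectors are placeholders, not the app's actual ones):

// Build the toolbar; assumes a view controller that implements the
// (hypothetical) action methods named below.
- (void)setupToolbar {
    UIToolbar *toolbar = [[UIToolbar alloc] initWithFrame:
        CGRectMake(0, self.view.bounds.size.height - 44,
                   self.view.bounds.size.width, 44)];

    UIBarButtonItem *queryButton = [[UIBarButtonItem alloc]
        initWithTitle:@"Query" style:UIBarButtonItemStyleBordered
               target:self action:@selector(showQuery)];
    UIBarButtonItem *mainButton = [[UIBarButtonItem alloc]
        initWithTitle:@"Main" style:UIBarButtonItemStyleBordered
               target:self action:@selector(showMainView)];
    UIBarButtonItem *refreshButton = [[UIBarButtonItem alloc]
        initWithBarButtonSystemItem:UIBarButtonSystemItemRefresh
                             target:self action:@selector(refreshView)];
    UIBarButtonItem *backButton = [[UIBarButtonItem alloc]
        initWithTitle:@"Back" style:UIBarButtonItemStyleBordered
               target:self action:@selector(goBack)];

    toolbar.items = [NSArray arrayWithObjects:queryButton, mainButton,
                     refreshButton, backButton, nil];
    [self.view addSubview:toolbar];

    // Pre-ARC memory management: the view hierarchy and the items
    // array retain these, so release our references.
    [queryButton release];
    [mainButton release];
    [refreshButton release];
    [backButton release];
    [toolbar release];
}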

In the next few days I will be reading up on Objective-C in the iPhone book that I acquired earlier this week. I hope to complete the new features by the April 19th deadline, when we will be demoing the mobile app for the people funding the NIJ project.