Today I have been working on the latest implementation of the NIJ Mobile app. Currently I can display buildings as images (this will later be changed to read polygon data from a building vertex file) and query the database from the application, and I have implemented some of the iPhone's features such as Core Location, multitouch, and the accelerometer.
Today I have been setting up the OpenGL views and buffers that will allow me to draw with OpenGL ES on the iPhone. I currently have one building being displayed, and the user can rotate, zoom, and pan the view. Since the iPhone Simulator that came with the SDK does not let me test multitouch, I cannot test the zoom and rotate controls at the moment.
In the next few days I hope to add more functionality, most notably the ability to read the building vertex files that Jianfei's program already reads. He already has the code for this, and I hope a straightforward port of his C++ code into my Objective-C iPhone app will be possible.
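Since Objective-C is a superset of C, the port should mostly be a matter of carrying the file-reading logic over. As a rough sketch of what that reader might look like, assuming a simple text format of one "x y z" triple per line (the actual format of Jianfei's files may differ):

```c
#include <stdio.h>

/* One 3D point of a building polygon. */
typedef struct {
    float x, y, z;
} Vertex;

/* Hypothetical reader: scans whitespace-separated float triples from fp
 * into out, stopping at max entries or the first malformed/EOF line.
 * Returns the number of vertices read. */
int read_vertices(FILE *fp, Vertex *out, int max)
{
    int n = 0;
    while (n < max &&
           fscanf(fp, "%f %f %f", &out[n].x, &out[n].y, &out[n].z) == 3) {
        n++;
    }
    return n;
}
```

Code like this compiles unchanged inside an Objective-C (.m) file, so the vertex arrays it fills can be handed straight to the OpenGL ES drawing code.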
Below is a screenshot of the main Map view for the application:
This image shows the main map, which provides the functionality described above. I have also added a toolbar with simple function buttons to jump to a query, return to the main view, refresh the current view, and go back to the previous screen.
In the next few days I will also be reading up on Objective-C using the iPhone book I acquired earlier this week. I hope to complete the new features by the April 19th deadline, when we will be demoing the mobile app for the people funding the NIJ project.