Research Log

This is a log of all my findings and processes while making this project.

2.07.2014

For my first step, I want to figure out which sensors I should use to check the device's orientation when taking pictures. I want to do this so that I can make sure the user is taking upright pictures every time. Looking at the different types of sensors available on the Android platform, a few stood out as potentially useful:

  • TYPE_ORIENTATION : Measures degrees of rotation that a device makes around all three physical axes (x, y, z). As of API level 3 you can obtain the inclination matrix and rotation matrix for a device by using the gravity sensor and the geomagnetic field sensor in conjunction with the getRotationMatrix() method.
  • TYPE_ROTATION_VECTOR : Measures the orientation of a device by providing the three elements of the device's rotation vector.

In the documentation, it says that TYPE_ORIENTATION can be used for determining device position while TYPE_ROTATION_VECTOR is used for motion detection and rotation detection.

Below are the available sensors on my testing device, an HTC One M7. Both sensors that I'm interested in are present.

  • 3-axis Magnetic field sensor
  • 3-axis Accelerometer
  • Proximity sensor
  • Light sensor
  • Gyroscope sensor
  • Orientation sensor
  • Rotation vector
  • Linear acceleration
  • Gravity
  • Gesture sensor

The coordinate system for the world

The coordinate system for a device

Based on the documentation, for now it seems that I should use the orientation sensor. It provides the following data via [getOrientation](https://developer.android.com/reference/android/hardware/SensorManager.html#getOrientation(float[], float[])). All units are in radians:

  • SensorEvent.values[0] : Azimuth (angle around the z-axis)
  • SensorEvent.values[1] : Pitch (angle around the x-axis)
  • SensorEvent.values[2] : Roll (angle around the y-axis)
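
To make sure I understand the API, here is a rough sketch of how these three values can be read by building a rotation matrix from the accelerometer and magnetometer and passing it to getOrientation(). This is just a sketch with placeholder class and field names, not code from the app:

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

// Sketch only: derive azimuth/pitch/roll (radians) from the accelerometer
// and magnetometer. Class and field names are placeholders.
public class OrientationReader implements SensorEventListener {
    private final float[] accelReading = new float[3];
    private final float[] magReading = new float[3];
    private final float[] rotationMatrix = new float[9];
    private final float[] orientationAngles = new float[3];

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            System.arraycopy(event.values, 0, accelReading, 0, 3);
        } else if (event.sensor.getType() == Sensor.TYPE_MAGNETIC_FIELD) {
            System.arraycopy(event.values, 0, magReading, 0, 3);
        }
        // getRotationMatrix() returns false when the readings are unusable.
        if (SensorManager.getRotationMatrix(rotationMatrix, null, accelReading, magReading)) {
            SensorManager.getOrientation(rotationMatrix, orientationAngles);
            float azimuth = orientationAngles[0]; // rotation around the z-axis
            float pitch = orientationAngles[1];   // rotation around the x-axis
            float roll = orientationAngles[2];    // rotation around the y-axis
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // Not needed for this sketch.
    }
}
```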

2.09.2014

For my UI, I want something similar to the panorama feature on my HTC One running KitKat 4.4.2 with Sense 5.5. When the device is in landscape mode but its Y-axis isn't parallel to flat ground, the panorama UI displays a line conveying this, like so:

Device with Y-Axis not parallel to flat ground

When the device's Y-axis is parallel to flat ground, the dotted blue line will overlap the dotted gray line, like so:

Device with Y-Axis parallel to flat ground

I'm doing this so that all the pictures a user takes around a scene will be as straight as possible. I calculate this by using the device's gravity sensor to measure the acceleration due to gravity along the x-axis. If the absolute value of the measured acceleration is within an epsilon of 9.8 m/s^2, the device's Y-axis is parallel to flat ground; otherwise, a significant component of gravity lies along another axis.
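
In code, that check is roughly the sketch below. The epsilon value here is arbitrary and just for illustration:

```java
import android.hardware.SensorManager;

// Sketch: decide whether the device's Y-axis is parallel to flat ground.
// gravityValues is the values array from a Sensor.TYPE_GRAVITY event.
public final class HorizontalCheck {
    private static final float EPSILON = 0.3f; // tolerance in m/s^2, illustrative value

    public static boolean isYAxisParallelToGround(float[] gravityValues) {
        // In landscape, gravity should land almost entirely on the device's x-axis.
        return Math.abs(Math.abs(gravityValues[0]) - SensorManager.GRAVITY_EARTH) < EPSILON;
    }
}
```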

Current Progress

Currently I have several TextViews displaying the acceleration due to gravity for each axis, as shown below:

Current Progress


2.12.2014

Current Progress

I just finished making a custom view titled HorizontalOffsetView. It determines whether or not the device is horizontal with respect to the ground; when I put it on top of the camera view, users will know when they are taking horizontal pictures. Currently I'm using the gravity sensor for this, but I think I'll end up having to use the rotation vector sensor instead, because it can tell me both whether the device is horizontal and when the user has rotated a full 360 degrees around a particular axis. HorizontalOffsetView behaves very similarly to the custom view seen in the panorama mode of HTC Sense 5.5's camera app. Here are some screenshots:

Non-Horizontal

Non-Horizontal View

Horizontal

Horizontal View
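
For context, the rough shape of a view like this is sketched below. It is a heavily simplified placeholder, not the actual HorizontalOffsetView implementation:

```java
import android.content.Context;
import android.graphics.Canvas;
import android.graphics.Color;
import android.graphics.Paint;
import android.util.AttributeSet;
import android.view.View;

// Heavily simplified sketch of a view like HorizontalOffsetView: it draws a
// fixed reference line and a second line whose tilt reflects how far the
// device is from horizontal. Names and drawing details are illustrative.
public class OffsetLineView extends View {
    private final Paint referencePaint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private final Paint offsetPaint = new Paint(Paint.ANTI_ALIAS_FLAG);
    private float offsetDegrees; // 0 when the device is horizontal

    public OffsetLineView(Context context, AttributeSet attrs) {
        super(context, attrs);
        referencePaint.setColor(Color.GRAY);
        referencePaint.setStrokeWidth(4f);
        offsetPaint.setColor(Color.BLUE);
        offsetPaint.setStrokeWidth(4f);
    }

    // Called from the sensor listener whenever a new reading arrives.
    public void setOffsetDegrees(float degrees) {
        offsetDegrees = degrees;
        invalidate(); // trigger a redraw
    }

    @Override
    protected void onDraw(Canvas canvas) {
        float cx = getWidth() / 2f;
        float cy = getHeight() / 2f;
        // Fixed gray reference line across the middle of the view.
        canvas.drawLine(0, cy, getWidth(), cy, referencePaint);
        // Blue line rotated by the current offset; overlaps the gray line at 0 degrees.
        canvas.save();
        canvas.rotate(offsetDegrees, cx, cy);
        canvas.drawLine(0, cy, getWidth(), cy, offsetPaint);
        canvas.restore();
    }
}
```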


2.22.2014

Current Progress

When creating the UI to determine the direction the user is facing, using the rotation vector and unit quaternions turned out to be too much of a challenge, so I decided to use the magnetometer in combination with the accelerometer to make a compass of sorts. For debugging purposes, each gray circle shows its angle in degrees off North. The circles move depending on which direction the user is facing. The circles are formed using the AlignmentCircle and YawCircle classes. Here are some screenshots:

Unaligned with yaw circle

Aligned with yaw circle
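
For reference, the angle shown in each gray circle is just the azimuth converted to degrees off North, roughly like this sketch (reusing the orientationAngles array from the sketch in the 2.07.2014 entry):

```java
// Sketch: convert the azimuth (radians, from SensorManager.getOrientation)
// into the 0-360 degree value displayed in a circle.
float azimuthDegrees = (float) Math.toDegrees(orientationAngles[0]);
float degreesOffNorth = (azimuthDegrees + 360f) % 360f; // normalize to [0, 360)
```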

Future Goals

Next up, I need to put the HorizontalOffsetView over a camera view so the user can see what pictures they're taking, and add the functionality to actually take a picture. HorizontalOffsetView is also missing UI to indicate the camera's angle off the horizon, so that will have to be added. Finally, I'll need to be able to tell the horizontal displacement from picture to picture to determine whether the user has moved so much that the pictures won't overlap well. I'll see if I can do this using the device's accelerometer.


3.4.2014

Current Progress

I have now overlaid the HorizontalOffsetView on a live camera preview. In addition, there is UI to indicate the angle off the horizon, plus vertical crosshairs to ease alignment. When the device is not horizontal to the ground, the take-picture button is disabled. As soon as the device is horizontal, the HorizontalOffsetView's EventListener gets a call to onAligned() and the button is enabled. Currently this is in debug mode, and the number in each AlignmentCircle is the angle off the horizon it represents. Here are a few screenshots with the device aligned and unaligned.

Unaligned device

Aligned device
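
The wiring between the view and the button is roughly the sketch below. onAligned() is the real callback mentioned above; the listener setter and the onUnaligned() name are placeholders I'm using for illustration:

```java
// Sketch: enable the shutter button only while the device is level.
// onAligned() exists in HorizontalOffsetView; the rest is placeholder naming.
horizontalOffsetView.setEventListener(new HorizontalOffsetView.EventListener() {
    @Override
    public void onAligned() {
        takePictureButton.setEnabled(true);
    }

    @Override
    public void onUnaligned() {
        takePictureButton.setEnabled(false);
    }
});
```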

As you can see from the preview, some work is still needed with the camera API with regard to focusing: the pictures currently being taken are almost always blurry unless the focal point is far away. I also need to add UI to indicate that a picture has been taken at a particular location.

While working with the app, I noticed a lot of sensor drift, so I attempted to measure it. I first measured the drift of the combined accelerometer and magnetometer, which together provide the direction the user is facing with respect to North. I collected 7,399 measurements over a period of approximately one minute with the device facing North, so the expected value is zero degrees. The raw data is here. Here is a histogram showing my findings:

Compass sensor data histogram

I also measured the sensor drift for the rotation vector, which is used to measure the angle off the horizon. The device was face up on a table, so the true measurement should be 270 degrees. I took 4,293 measurements over a period of approximately two minutes. The raw data is here. Here is a histogram showing my findings:

Rotation vector data histogram

As you can see, the compass sensor has a lot more noise than the rotation vector, but both signals will need to be filtered.
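
One simple option I'm considering for the filtering is an exponential low-pass filter over the raw readings, along the lines of this sketch (the smoothing factor is an arbitrary starting point):

```java
// Sketch: exponential low-pass filter for noisy sensor readings.
// Smaller ALPHA = heavier smoothing but more lag; 0.15 is arbitrary.
// Note: angles that wrap around at 0/360 degrees would need unwrapping first.
private static final float ALPHA = 0.15f;
private float[] filtered;

float[] lowPass(float[] input) {
    if (filtered == null) {
        filtered = input.clone();
        return filtered;
    }
    for (int i = 0; i < input.length; i++) {
        filtered[i] += ALPHA * (input[i] - filtered[i]);
    }
    return filtered;
}
```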

I've also added a setup screen to the app that prompts for the name of the scene. A screenshot is available below:

Setup screen

Once a scene name is entered and the Capture Scene button is pressed, the user is taken to the main camera view screen. When a picture is taken, it is saved in the device's default pictures directory, in a folder titled "Scene Recon". The file name is formatted as follows: NameOfScene_Roll_rollOfDeviceWhenPictureWasTaken_Pitch_pitchOfDeviceWhenPictureWasTaken.jpg. Here is an example: Android_Roll_198.23932_Pitch_179.60252.jpg
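
The directory and file name are put together roughly like this sketch (using the standard Environment API; the exact code in the app may differ):

```java
import android.os.Environment;
import java.io.File;
import java.util.Locale;

// Sketch: build the output file under Pictures/Scene Recon using the
// NameOfScene_Roll_<roll>_Pitch_<pitch>.jpg naming scheme described above.
public final class SceneFileNamer {
    public static File buildOutputFile(String sceneName, float roll, float pitch) {
        File picturesDir = Environment.getExternalStoragePublicDirectory(
                Environment.DIRECTORY_PICTURES);
        File sceneReconDir = new File(picturesDir, "Scene Recon");
        if (!sceneReconDir.exists()) {
            sceneReconDir.mkdirs();
        }
        String fileName = String.format(Locale.US, "%s_Roll_%.5f_Pitch_%.5f.jpg",
                sceneName, roll, pitch);
        return new File(sceneReconDir, fileName);
    }
}
```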

Future Goals

  • Filter noise from sensors
  • Work with camera API to focus images
  • Create UI to indicate picture has been taken at a certain roll and pitch

3.6.2014

Current Progress

I've done more sensor drift measurements for the compass to determine whether the distribution is really bimodal. My next two measurements were taken in a different location and only have a single peak. My first measurement was taken on my desk, where there are many cables and wires that may have distorted the magnetic field. For the first of the new measurements, I had my device lying screen up on a table facing East. East is 90 degrees from North, but I use (360 - measuredAngle) so the movement of the AlignmentCircles is more intuitive; therefore, the expected position is 270 degrees. The raw data for this measurement is here.

Compass Sensor Data Histogram [Expected = 270]


For my second measurement, I had my device lying screen up on a table facing South. The expected angle is 180 degrees. The raw data for this measurement is here.

Compass Sensor Data Histogram [Expected = 180]


For the app itself, I've created a UI to indicate whether a picture has been taken at a particular AlignmentCircle. If a picture has been taken, a camera icon will appear in the center of the AlignmentCircle. If not, the center of the AlignmentCircle will be empty. I have also added some text at the bottom of the screen to indicate how many AlignmentCircles the user has covered. The camera also continuously focuses so images are no longer blurry. A screenshot is below.

Updated UI
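
For reference, the continuous-focus setup with the pre-Camera2 android.hardware.Camera API looks roughly like this sketch, guarded because not every device supports the mode:

```java
// Sketch: request continuous focus from the legacy Camera API so preview
// frames and captures stay sharp. 'camera' is an open android.hardware.Camera.
Camera.Parameters params = camera.getParameters();
if (params.getSupportedFocusModes()
        .contains(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE)) {
    params.setFocusMode(Camera.Parameters.FOCUS_MODE_CONTINUOUS_PICTURE);
    camera.setParameters(params);
}
```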


5.12.2014

Current Progress

This is more of a maintenance update. I've cleaned up the code and documented everything to make it easier to read. I've also updated the UI once more to display the current roll, pitch, and yaw of the device.

Updated UI