Here are some of the results of my work on the GaugeCam water-level measurement project.
The problem: We measure accurately when the target is stable relative to the camera, but if the camera gets bumped, the target sinks or tilts in the water, or anything else alters the geometric relationship between the camera and the target, the measurements come out wrong.
The solution: Adjust the calibration to accommodate some of the changes in geometry between the camera and the target. We cannot handle large changes, but we can handle those that most commonly occur in the field.
We thought you might like this image from the tidal marsh camera on February 15, 2012 at 9:15 AM. We certainly did.
We have finally gotten around to working on the problem of handling motion between the camera and the calibration target. After the system has been calibrated, if the camera has moved, we need to make adjustments so our water-level search does not measure improperly. We have finished the first step in the process. If we know the nominal position of a calibration feature in an image such as the one shown in Figure 1, we can find the change in position of the target in a subsequent image such as the one shown in Figure 2. Figure 3 shows the reference image overlaid with the moved image; as can be seen, they do not line up. All we need to do is rotate and translate the image so the targets in the moved image are aligned with those in the reference image, and then we can perform the search accurately as before. Figure 4 shows the moved image overlaid on the reference image after the adjustment has taken place.
The next step will be to integrate this into our GRIM software, followed by integration into our web service.
Figure 1. Reference image
Figure 2. Target find of moved image
Figure 3. Unaligned images
Figure 4. Adjusted image overlaid on the reference image
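The alignment step described above can be sketched as a least-squares rigid fit (the Kabsch method): given the fiducial positions found in the reference image and in the moved image, solve for the rotation and translation that map one set onto the other. This is only an illustration of the general technique, not the GRIM code; the function name and the synthetic points are made up for the example.

```python
import numpy as np

def estimate_rigid_transform(ref_pts, moved_pts):
    """Estimate the rotation R and translation t that map points found in
    the moved image back onto the reference image (least-squares Kabsch fit)."""
    ref = np.asarray(ref_pts, dtype=float)
    mov = np.asarray(moved_pts, dtype=float)
    ref_c = ref.mean(axis=0)
    mov_c = mov.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (mov - mov_c).T @ (ref - ref_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = ref_c - R @ mov_c
    return R, t

# Synthetic check: rotate/translate some "fiducial" points, then recover the motion
theta = np.radians(3.0)                      # a small camera tilt
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([4.0, -2.5])               # a small shift, in pixels
ref = np.array([[10.0, 20.0], [80.0, 22.0], [45.0, 90.0], [12.0, 75.0]])
moved = ref @ R_true.T + t_true
R, t = estimate_rigid_transform(ref, moved)
aligned = moved @ R.T + t                    # map the moved fiducials back
print(np.allclose(aligned, ref))             # True
```

Once R and t are known, the same transform can be applied to the whole moved image so the water-level search runs against pixels that line up with the original calibration.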
Here is a brief video that describes the current state of the GaugeCam Remote Image (GRIM) software GUI. Most of these changes are “usability” changes, but there will be more changes in the near future to accommodate some advanced image-processing techniques that will improve on the already impressive suite that is currently in place.
Here are some images that have failed in the past, but that can now be handled by our new (not yet released) waterline-finding algorithm. We still have some work to do, particularly on images with shadows in them, but we are definitely moving up the curve in terms of our ability to handle more difficult images. The last dirty image is particularly impressive. We will add a few more improvements, then start working on handling the minor camera movements that cause problems for the find algorithm.
Dirty Image 1
Dirty Image 2
Dirty Image 3
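The actual waterline-finding algorithm has not been released, but the general idea behind any detector of this kind can be illustrated with a toy version: average each image row and look for the biggest brightness jump between adjacent rows, which marks the transition from the target above the water to the darker water below. Everything here (the function, the synthetic image) is illustrative only.

```python
import numpy as np

def find_waterline_row(image):
    """Crude stand-in for a waterline detector: average each row, then
    return the row index with the largest jump between adjacent row means."""
    row_means = image.mean(axis=1)
    gradient = np.abs(np.diff(row_means))
    return int(np.argmax(gradient)) + 1

# Synthetic "stage image": bright target above row 60, darker water below
img = np.full((100, 40), 200.0)
img[60:, :] = 80.0
img += np.random.default_rng(0).normal(0, 5, img.shape)  # sensor noise
print(find_waterline_row(img))  # 60
```

The hard cases shown above (dirt, shadows, fog) are exactly the ones where a simple gradient like this fails, which is why the real algorithm needs the additional robustness work described in the post.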
Even though the blog has been fairly quiet for a while, a lot of work has been done to improve the capability of the GaugeCam water-level measurement camera system. We will now start posting more frequently to discuss some of these improvements and to talk about planned future work. There are three categories of improvements where we have made significant advancements. These are:
- The real-time web interface – This is the movement of images from the camera to the web server, the application of the algorithm to create measurements, and the presentation of graphics and measurements on the internet. You can see some of the results of this work on this web page that shows the water level in a tidal marsh on the North Carolina coast as measured by one of our cameras. Andrew is responsible for our software systems and infrastructure.
- Camera, remote power, mounting, and target hardware – One of our hardest tasks is to develop a truly remote camera system that generates its own power, withstands the weather, provides its own light at night, is physically stable, etc., etc. François is responsible for this in addition to his maintenance of the lab and our test cameras. Up until now, his improvements to the hardware have taken place behind the scenes, but expect to see a dramatically improved set of hardware on this blog in the very near future as we move to our first prototype camera production run.
- Vision algorithms – Until the marsh camera was put into place and started shoveling images out to our web site, the requirements of the image processing software were not really well known, because there were no images to work with other than what we gathered in the lab. Therefore, we made our best guess at what was required, wrote the algorithms, and deployed them. They really work quite well, but now that we have a “real” and continuing stream of images, we know a lot more about what the vision algorithms will have to handle. I (Ken) am responsible for making the improvements to handle things like fog (see Image 1 below) and dirty high water marks (see Image 2 below). I will write about these and other improvements to the vision algorithms as they are developed and deployed.
Image 1. Fog
Image 2. High water line
The following is a video of our new auto-calibration capability. Previously, calibrating the GaugeCam water measurement software took a big effort because it was necessary to specify a region of interest for each of the calibration target fiducials. In addition, the previous algorithm struggled with degraded images. This video demonstrates the new ease of use and calibration robustness, even with bad images.
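Automatic fiducial finding of the kind shown in the video can be sketched with brute-force normalized cross-correlation: slide a small template of the fiducial pattern over the frame and take the best-scoring position, so no hand-drawn region of interest is needed. This is a toy illustration under assumed names and synthetic data, not the algorithm in the video.

```python
import numpy as np

def locate_fiducial(image, template):
    """Brute-force normalized cross-correlation: return the (row, col) of
    the top-left corner where the template matches the image best."""
    ih, iw = image.shape
    th, tw = template.shape
    tz = template - template.mean()
    tnorm = np.linalg.norm(tz)
    best_score, best_pos = -np.inf, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            patch = image[y:y + th, x:x + tw]
            pz = patch - patch.mean()
            denom = np.linalg.norm(pz) * tnorm
            score = (pz * tz).sum() / denom if denom > 0 else 0.0
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos

# Synthetic frame: noisy background with one bright square "fiducial"
rng = np.random.default_rng(1)
frame = rng.normal(100.0, 3.0, (60, 40))
fiducial = np.zeros((9, 9))
fiducial[2:7, 2:7] = 150.0           # bright square with a dark border
frame[25:34, 10:19] += fiducial      # paste it at row 25, col 10
print(locate_fiducial(frame, fiducial))  # (25, 10)
```

Because the correlation is normalized, the match survives uniform brightness changes, which is one reason template-style detectors hold up better on degraded images than fixed thresholds do.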
GaugeCam began to investigate ways to actuate weirs, pumps, lighting, and other equipment at their remote camera locations a couple of years ago. As part of the investigation, we purchased several Arduino micro-controllers. At the time, the NCSU BAE department used expensive PLCs (programmed in ladder logic) to control their equipment out in the field. A low-end system typically cost them around $600, while an Arduino micro-controller with similar functionality lists for between $35 and $80 (USD). The Arduinos we use have between 14 and 64 digital I/O pins, several microsecond timers, and a staggering array of optional add-ons. They are programmed in C, so there is not much of a learning curve to get started with them.
We use a variety of micro-controllers with the GaugeCam cameras and to control light positioning in some of our lab setups. When Dr. Birgand saw this, he wondered whether we could use the much less expensive micro-controllers to run pumps and other equipment both in the lab and out in the field. He put one of the lab assistants to work learning to program the boards and now, several months later, they have developed a couple of solutions that are more flexible, smaller, lighter, and more capable, not to mention much cheaper, than the previous solution.
This is an example of one of several synergies that have occurred as a result of the collaboration between GaugeCam and the NCSU BAE department. Even though we do not plan to use Arduinos to drive actuators on our remote camera products (we are several months away from an exciting announcement in that regard), the research has contributed to the effectiveness and capability of the lab in its ongoing research.
Hurricane Irene is expected to pass through North Carolina in the next day or so. GaugeCam is located in Raleigh, which is expected to receive one to two inches of rain and winds in the range of 30-40 mph. The tidal marsh camera could get hit hard. You can see from François’s previous post that it is directly in the hurricane’s path and could experience up to a foot of rain and 100+ mph winds. François sent his team out to the marsh to temporarily remove the camera. Thanks for the help, guys!
CORRECTION!: Troy just popped me a note to let me know the team has MOVED the camera to a higher location in the marsh where it will weather the storm. It will be interesting to see how all this works out.
All the equipment is up and running at the ASABE Conference in Louisville, Kentucky. We experienced the usual conference equipment and software hiccups, but Andrew, Troy, and François worked it all out. Everything is up and running as planned and the GaugeCam booth is experiencing a continuous flow of interested conference attendees.
GaugeCam demonstrates technologies developed to measure water levels in the wild for the BAE lab at North Carolina State University. We show live images arriving at the GaugeCam booth from a solar-powered, cell-phone-equipped Colorado Video camera of a water scene in a coastal North Carolina marsh. The results are presented in real-time on the internet as a graph of water levels coupled with archived images. A second live demo features images arriving at a server in the booth from a Microseven camera pointed at a water column in the booth and presented, again, as a graph of the water-level measurements coupled with an image archive.
GaugeCam’s purpose at this conference is to present technology that retrieves images and sensor data from remote sites, presents both raw and processed data automatically on the web, and allows the user to control motors, weirs, lights, and other actuators either manually or based on sensor conditions and user-specified logic via the internet. The feedback we receive from conference attendees will help us refine our current commercial offerings, identify consulting opportunities, and possibly even identify new product offerings.
Still to come: Troy will present the results of the NCSU research based on data gathered with GaugeCam equipment.