GaugeCam began investigating ways to actuate weirs, pumps, lighting, and other equipment at our remote camera locations a couple of years ago. As part of the investigation, we purchased several Arduino microcontrollers. At the time, the NCSU BAE department used expensive PLCs (programmed in ladder logic) to control their equipment out in the field. A low-end system typically cost them around $600. The list price for an Arduino microcontroller with similar functionality is between $35 and $80 (USD). The Arduinos we use have between 14 and 64 digital I/O pins, several microsecond timers, and a staggering array of optional add-ons. They are programmed in C, so there is not much of a learning curve to put them to use.
We now have a variety of microcontrollers working with GaugeCam cameras and controlling light position on some of our lab setups. When Dr. Birgand saw this, he wondered whether we could use the much less expensive microcontrollers to run pumps and other equipment both in the lab and out in the field. He put one of the lab assistants to work learning to program the equipment, and now, several months later, they have developed a couple of solutions that are more flexible, smaller, lighter, and more capable, not to mention much cheaper, than the previous solution.
This is one of several synergies that have occurred as a result of the collaboration between GaugeCam and the NCSU BAE department. Even though we do not plan to use Arduinos as actuators on our remote camera products (we are several months away from an exciting announcement in that regard), the research has contributed to the effectiveness and capability of the lab in its ongoing research.
All the equipment is up and running at the ASABE Conference in Louisville, Kentucky. We experienced the usual conference equipment and software hiccups, but Andrew, Troy, and François worked them all out. Everything is operating as planned, and the GaugeCam booth is seeing a continuous flow of interested conference attendees.
GaugeCam demonstrates technologies developed to measure water levels in the wild for the BAE lab at North Carolina State University. We show live images arriving at the GaugeCam booth from a solar-powered, cell-phone-equipped Colorado Video camera viewing a water scene in a coastal North Carolina marsh. The results are presented in real time on the internet as a graph of water levels coupled with archived images. A second live demo features images arriving at a server in the booth from a Microseven camera pointed at a water column in the booth, presented, again, as a graph of the water level measurements coupled with an image archive.
GaugeCam's purpose at this conference is to present technology that retrieves images and sensor data from remote sites, presents both raw and processed data automatically on the web, and allows the user to control motors, weirs, lights, and other actuators via the internet, either manually or based on sensor conditions and user-specified logic. The feedback we receive from conference attendees will help us refine our current commercial offerings, identify consulting opportunities, and possibly even identify new product offerings.
Still to come: Troy will present the results of the NCSU research based on data gathered with GaugeCam equipment.
The GaugeCam team has been putting in long days for the last couple of weeks to prepare the presentation Troy will deliver and the demonstration of our water column at the American Society of Agricultural and Biological Engineers (ASABE) conference in Louisville, KY on August 7-10, 2011. We plan to show a version of the laboratory column used at our NCSU laboratory to demonstrate the capabilities of our remote camera and image evaluation software to measure water height in the wild. Troy will describe our methods and our most important results in his presentation. Anyone who would like a copy of our desktop software is welcome to come by our booth to see the software in operation and talk to Andrew, François, and Troy. We had hoped to hand out copies of the software at the conference, but we are making last-minute "ease of use" adjustments to the software and documentation so it will be more usable by other researchers. We plan to make the software available for download by August 19. Leave your email or snail mail address with us (or just drop us a line here) and we will let you know where to go on the web as soon as the alpha release version of the software is available.
We are looking forward to seeing you there!
Troy has the GRIM (GaugeCam Remote IMage processor) program in his hands and is in the throes of final testing. We plan to release it as soon as we get the documentation together and fix any last-minute bugs. We are quite pleased with how it turned out. There is one known issue, and it is not a show-stopper: Troy found that if the image exhibits extreme fish-eye distortion, the calibration is thrown off a little. This is not a problem with the calibration itself; rather, it is a problem with the positioning of the fiducials in the scene. The solution is to use optics that minimize fish-eye, which is not much of a challenge. We believe this software is sufficiently robust and accurate for the research (and refereed journal articles) we are writing. We will discuss the release schedule during our Saturday meeting and report back here.
TWO NEW PROJECTS FOR GAUGECAM!!!
Really Cheap GUI-Programmable Microcontroller – The first is what we will call the Arduino controller project. One of François's students has designed a system that moves water through a series of tubes for measurement by a spectroscope. The system needs to be flushed, and water needs to be moved, according to a very specific pattern. The student built a fairly sophisticated control system based on a Programmable Logic Controller (PLC). The problem is that several more of these systems are required and the PLCs cost $750 each. I looked at the system and commented that I knew of a low-cost (<$70) microcontroller that could replace the PLC. The microcontroller we have chosen is the Arduino Mega 2560. It features 54 digital I/O pins (14 PWM) and 16 analog inputs. We could simply write the program in the controller and give it to NCSU, but there is a better long-term solution. GaugeCam would like to write an easy-to-use, GUI-based program that researchers can use to develop control code for their projects without having to learn ladder logic for a PLC or C/C++ for a microcontroller. We will need to write a program that runs in the microcontroller and a program that runs on a Windows or Linux PC. I will lay that program out more fully in the coming weeks. François bought the microcontroller this week, so we should be able to start on the project as soon as we get GRIM and a few other things off our plate. We are aiming at a <$100 solution including the development software. Tentatively, we are calling the new program GaugeCam Arduino Programmer (GAP).
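At its heart, the control pattern the PLC ran is a timed state machine: energize one output, wait, move to the next step. As a rough illustration of the sequencing logic (sketched here in Python for readability; the Arduino itself would run C, and the step names, pin numbers, and durations below are invented for this example):

```python
# Hypothetical flush/sample/measure cycle of the kind the PLC performed.
# Pins and durations are invented; on an Arduino Mega these steps would
# map to digitalWrite() calls with delays driven by millis().

CYCLE = [
    # (step name, output pin to energize, duration in seconds)
    ("flush",   22, 30),    # open flush valve, run pump
    ("sample",  24, 10),    # draw sample water into the cell
    ("measure", None, 60),  # pumps off while the spectroscope reads
]

def run_cycle(set_pin, wait):
    """Step through the cycle once, driving one output at a time."""
    for name, pin, seconds in CYCLE:
        if pin is not None:
            set_pin(pin, True)   # energize the step's output
        wait(seconds)            # hold for the step duration
        if pin is not None:
            set_pin(pin, False)  # de-energize before the next step
    return sum(seconds for _, _, seconds in CYCLE)

# A dry run with stubbed hardware shows the switching sequence:
log = []
total = run_cycle(lambda p, on: log.append((p, on)), lambda s: None)
print(total)  # 100 seconds per full cycle
print(log)    # [(22, True), (22, False), (24, True), (24, False)]
```

A GUI front end like the proposed GAP would essentially let a researcher build the `CYCLE` table graphically and then generate the equivalent C for the microcontroller.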
Really Cheap Machine Vision System – This second project has been in the works for a while. We made a lot of progress on it when it was called KamVu. For a number of reasons, we have dusted the program off and started working on it again. The idea is to develop a full-blown vision system that runs either on a Linux or Windows PC (netbook, desktop, or laptop) or on a BeagleBoard. With the BeagleBoard option, we believe it should be possible to do sophisticated machine vision for <$200 with a webcam, or <$350 with an industrial machine vision camera with trigger inputs, strobe outputs, and an industrial housing. We are working with The Imaging Source on this project, as they are the leading provider of inexpensive but capable industrial machine vision cameras. This vision system is aimed at those who need a cheap vision solution but are not programmers. At least initially, the idea is not to sell the systems, but to provide a free, probably open-source program for developing applications and to show people how to buy the components and put them together in a step-by-step fashion. We hope to do our own Debian-based Linux distribution specifically for the BeagleBoard so that a user can just install it, plug in a camera, and start developing. We already have a Windows installer that discovers the digital I/O and cameras and is getting close to being able to develop simple presence/absence applications. Here is a screenshot of the application in its current state of development.
As we move toward the release of our beta software, I thought it would be a good time to reflect on our progress so far. As you would see if you cared to review this entire blog, we originally started with the idea that we could measure stream stage (water depth) using a camera. Why was this an attractive method when researchers and government agencies (USGS) already use a number of other methods, such as transducers and bubbler gauges? Well, from our collective experience, we know that field measurements are often erroneous due to instrument drift, infrequent or incorrect instrument calibration, or technician inexperience, just to name a few reasons. We felt the GaugeCam concept could address these error sources, while also providing a way to visually verify measurements.
After completing a brief proof of concept in the laboratory, we deployed a camera in the field near Pullen Park, Raleigh, NC. We chose this approach because we anticipated that the field application would involve many challenges we would never encounter in a lab-only study. We were correct! Our camera and communications system, which worked beautifully in the lab setting, was not as robust in the field as we had hoped. We were able to compile a list of issues associated with our field application, which we have addressed in the beta version of our software. The field application gave us the impetus to develop a functional daemon for processing images in real time on the GaugeCam server. Additionally, we were able to gather data for comparison with USGS stream stage data measured at Pullen Park.
While the Pullen Park deployment was underway, we stayed busy in the lab, assessing the capabilities of our camera and software. The camera was tested at a variety of distances and angles relative to the water level bench. We were encouraged by the results, but we knew that to minimize the need for highly experienced technicians, we would need to automate our calibration process. To test the automated calibration, we modified the water level bench, using a white background with horizontal black bars substituting for the water level. This was required to reduce the noise introduced by the water meniscus. Once the automatic calibration is verified, we will repeat our earlier study of water level detection from a variety of distances and angles. We will also deploy the system at alpha sites, which have already been identified. The transition from manual calibration to automatic calibration has been a little more difficult than anticipated (as seen in several recent posts). I feel we are encountering a typical challenge for machine vision projects: the abilities of the human eye are very difficult to emulate with an algorithm!
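To give a feel for why black bars on a white background help, here is a toy sketch of the kind of step the automated calibration performs: scan a vertical strip of the image and locate the center row of each dark run. This is not GaugeCam's actual algorithm; the intensity profile and threshold below are fabricated for illustration, and real images would need smoothing and subpixel refinement.

```python
# Fabricated vertical intensity profile (0 = black, 255 = white):
# ten white rows, a four-row black bar, ten white, another bar, ten white.
profile = [255] * 10 + [0] * 4 + [255] * 10 + [0] * 4 + [255] * 10

def bar_centers(column, threshold=128):
    """Return the center row of each dark run in a vertical column."""
    centers, start = [], None
    for row, value in enumerate(column):
        if value < threshold and start is None:
            start = row                            # entering a bar
        elif value >= threshold and start is not None:
            centers.append((start + row - 1) / 2)  # leaving a bar
            start = None
    if start is not None:                          # bar touches the bottom edge
        centers.append((start + len(column) - 1) / 2)
    return centers

print(bar_centers(profile))  # [11.5, 25.5]
```

Because the bars sit at known physical heights, the recovered pixel rows pin down the image-to-world relationship without an operator clicking on calibration points, and there is no meniscus to blur the edges.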
The students in the NCSU Biogeochemistry for Ecological Engineering lab are just finishing a Theory of Drainage course. GaugeCam would have been very useful for recording accurate and repeatable liquid level measurements during this lab experiment!
The bi-weekly GaugeCam meeting went particularly well on Saturday. Troy and François made plenty of good suggestions about how I should improve the software and fix some bugs. One of the coolest ones was the addition of an overlay scale. It is not finished yet, but from the image below, you can get the general idea.
It has been a while since this event, but viewing the video reminded me of the power of imagery. To a seasoned hydrologist, a hydrograph of this event may have nearly the same impact, but for most of us the imagery is more stunning.
Andrew, François, and I all met in the lab this Saturday for the regular bi-weekly meeting of the GaugeCam team at François's NCSU BAE lab. We talked about a lot of things and were able to perform the first test of the automatic vision calibration technique. François measured the exact positions of the calibration dots and the water level with the laser system he and Troy built for that purpose. We captured a couple of images of the test apparatus and calibration dots with the (really cheesy) webcam on my laptop. While we were at the lab, we got VERY good results.
I evaluated those images, and some others, when I got home and found that when the camera is not close to the level of the water, the calibration is thrown slightly off by viewing-angle distortion of the dots. We could do the math to back out those distortions, but it is much easier to change the shape of the fiducial. Currently we use a circular dot. Our short-term solution will be to change the fiducial shape to horizontal lines on a vertical ruler. That should work very well for the time being, but we will eventually need to move to a checkerboard calibration target and template matching to find the square intersections. I will discuss the benefits of such an approach when we get to it in a future version of the GaugeCam image processing software.
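With horizontal-line fiducials, two lines at known heights are enough to pin down a pixel-row-to-water-level mapping, at least while perspective distortion stays small. A minimal sketch of that idea, with invented row and height numbers (this is an illustration of the principle, not GaugeCam's calibration code):

```python
# Two horizontal fiducial lines at known physical heights define a
# linear map from image row to water level.  A linear fit is only
# reasonable when the camera is near water level; a checkerboard
# target and a full perspective model would relax that restriction.

def make_level_map(row_a, level_a, row_b, level_b):
    """Build a linear image-row -> water-level (cm) mapping."""
    slope = (level_b - level_a) / (row_b - row_a)  # cm per pixel row
    return lambda row: level_a + slope * (row - row_a)

# Say the fiducial lines are imaged at rows 120 and 480 and sit at
# 100 cm and 10 cm on the ruler (all numbers invented):
row_to_level = make_level_map(120, 100.0, 480, 10.0)
print(row_to_level(300))  # water edge detected at row 300 -> 55.0 cm
```

The appeal of lines over dots is that a line's detected row is insensitive to the viewing-angle smearing that shifts a circular dot's apparent centroid.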
Andrew continued his work on the web interface/database elements of the software and we discussed some of the commercialization issues. GaugeCam plans to provide an Alpha version of the software to NCSU for one of their research projects. We are in the process of identifying 6-8 beta partners with whom we hope to work when a product offering is available. We hope the Alpha program will start sometime this fall with the Beta program to start in late spring or early fall.
A couple weeks ago, we noticed the following question and answer on WRAL.com. The full answer is found at the bottom of this post. If you click here or on the “lakes and rivers” link at the end of the post you’ll find additional interesting questions and answers about lake and river levels.
Answer: Lake levels are generally measured and reported by the U.S. Army Corps of Engineers, though there are a few exceptions. The lake levels that we pass along on our web site are given in feet, and the measurement is how far the lake surface is above mean sea level. For most lakes, this elevation is measured at a gauge station somewhere near one side of the lake. In some cases, there are multiple locations where the surface height is measured, and an average of those heights is reported as the measured lake level.
May. 31, 2010 | Tags: lakes and rivers