About Primordial Android

We started Primordial Android to learn about mobile applications and the Android operating system.  Our first applications have scratched personal itches more than solved particular problems, so we were free to explore the programming environment.  With this knowledge we're looking forward to more development projects with customers.  Contact us at sales@primordand.com if you have an application you'd like to develop.

 

About PMVS

Primordial Machine Vision Systems was started in 1993 to apply our expertise in sensor hardware and image processing to general imaging problems.

 

The Retina Imager.  Log-polar imaging arranges the pixels of a sensor in a distinctive pattern.  Because the pixels lie on a circular grid, rotations of objects about the center point become simple linear shifts in the image.  Because the ring size grows logarithmically, scaling is also a linear shift.  More importantly, the larger peripheral pixels reduce the data coming off the sensor, which makes for faster image analysis.  The sensor is meant for an active environment, where the system shifts its attention, and the high-resolution part of the image, to interesting objects while monitoring the periphery for changes it needs to respond to.  The model for the sensor was the human visual system, whose resolution falls off outside the fovea and which uses saccades to scan a scene and focus on particular objects.  The Retina imager was developed at the University of Pennsylvania by Greg Kreider and Jan van der Spiegel.  It contains a custom CCD sensor with a log-polar layout, a clock driver ASIC, and driving electronics.  We continued to work with the imager.
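To make those shift properties concrete, here is a minimal sketch of a log-polar mapping.  The base radius, ring growth factor, and wedge count are illustrative values, not the Retina sensor's actual geometry; the point is only that rotation moves the wedge index and scaling moves the ring index.

```python
import math

def log_polar_index(x, y, r0=1.0, growth=1.1, n_wedges=64):
    """Map a Cartesian point to (ring, wedge) indices on a log-polar grid.

    Ring radii grow geometrically by `growth`, so multiplying the radius
    by `growth` advances the ring index by one; the wedges divide 2*pi
    evenly, so rotating by one wedge width advances the wedge index by one.
    """
    r = math.hypot(x, y)
    theta = math.atan2(y, x) % (2 * math.pi)
    ring = math.floor(math.log(r / r0) / math.log(growth))
    wedge = math.floor(theta / (2 * math.pi / n_wedges))
    return ring, wedge

# Rotating a point about the center shifts only its wedge index,
# and scaling it by the growth factor shifts only its ring index.
p = (10.0, 0.0)
a = 1.5 * (2 * math.pi / 64)          # one and a half wedge widths
rotated = (p[0] * math.cos(a) - p[1] * math.sin(a),
           p[0] * math.sin(a) + p[1] * math.cos(a))
scaled = (p[0] * 1.1, p[1] * 1.1)
```

Rotation and scale invariance then reduce to one-dimensional correlation over the index grid, which is far cheaper than resampling the full image.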

Retina imager

fringe measurement

Fringe Flatness Measurement Program.  The customer needed an outgoing inspection for a part to guarantee that its flatness was within a 10 micron spec over an area of 7 cm x 9 cm.  Our solution used a table interferometer with custom software.  The interferometer generates dark and bright bands, one pair per wavelength of the laser light.  The program identified the bands and corrected for imperfections in the image such as dust, figured out how the bands were connected and where the high and low points were, and determined the greatest height difference.  The program was accurate to 1 micron, limited by the resolution of the camera.  The software generated a measurement report for the end customer and was integrated into the customer's production environment, including operator training, data storage, calibration, and maintenance.
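The final step of that analysis reduces to simple arithmetic: once each fringe has an integer order, the peak-to-valley height is the span of the orders times the height step between adjacent fringes.  This sketch is illustrative, not the production code; the step value depends on the interferometer geometry and laser, so it is left as a parameter.

```python
def flatness_from_fringes(fringe_orders, step_per_fringe_um):
    """Peak-to-valley flatness from integer fringe orders.

    step_per_fringe_um is the height change between adjacent fringes,
    in microns; its value depends on the laser wavelength and whether
    the setup measures in reflection or transmission.
    """
    return (max(fringe_orders) - min(fringe_orders)) * step_per_fringe_um

# Hypothetical example: fringe orders spanning 0..30 at ~0.316 um per
# fringe put the part roughly 9.5 microns out of flat.
span_um = flatness_from_fringes([0, 5, 12, 30], 0.316)
```

The hard part of the real program was upstream of this: segmenting the bands, bridging gaps left by dust, and assigning consistent orders across the whole surface.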

 

Time of Flight Technologies.  Time of Flight measures the distance to objects by timing how long light takes to travel to the target and back; light covers about one foot per nanosecond.  For this project we looked into how to perform the measurement using CCD and CMOS sensors at normal frame rates and resolutions (many Time of Flight sensors use modified technologies and have fairly low resolution).  We created a model of the system from the light source to the sensor read-out, including noise throughout the sensor, to guide the design.  For example, the picture at left shows the error in the measurement as the distance increases and where it fails.  We modified sensor designs and cameras, including firmware changes for timing, and successfully characterized the performance of the systems to demonstrate the technical feasibility of these approaches.
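The timing arithmetic behind that rule of thumb is straightforward.  This sketch is just the round-trip conversion, not our sensor model; because the light travels out and back, a foot of range adds roughly two nanoseconds of delay.

```python
C_M_PER_S = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_delay_ns):
    """Target distance in meters from a round-trip delay in nanoseconds.

    The pulse travels to the target and back, so the one-way distance
    is half the delay times the speed of light: about 0.15 m of range
    per nanosecond of measured delay.
    """
    return C_M_PER_S * (round_trip_delay_ns * 1e-9) / 2.0
```

The practical challenge is that resolving centimeters requires timing to tens of picoseconds, which is why sensor noise and read-out timing dominated the feasibility model.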

ToF Simulation

Object extraction

Real-Time Object Extraction Program.  Despite the power of modern processors, the data throughput of a megapixel sensor is still too high to do much image processing at frame rates.  Suppose you want to extract an arbitrary object from a random background given conditions such as how far away it is and how big it is.  We tackled this problem by coupling two cameras, one for taking an image and one for the extraction (in this case, a low-resolution Time of Flight sensor).  The system breaks the analysis into successive layers that run independently, each as fast as it can, reducing the image data at each stage: object detection, extraction from the scene, analysis, and modification.  It runs on a high-end desktop system at frame rates in a five-stage pipeline (a five-frame delay from input to result).
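The layered design can be sketched as a thread-per-stage pipeline connected by queues: each stage consumes frames as fast as its input allows, so throughput is set by the slowest stage while latency is one frame per stage.  The stage functions here are hypothetical stand-ins, not the actual detection, extraction, analysis, and modification code.

```python
import queue
import threading

def run_pipeline(stage_fns, frames):
    """Push frames through independent stages connected by FIFO queues.

    Each stage runs in its own thread; a None sentinel propagates
    shutdown down the pipeline after the last frame.
    """
    queues = [queue.Queue() for _ in range(len(stage_fns) + 1)]

    def worker(fn, inbox, outbox):
        while True:
            item = inbox.get()
            if item is None:          # sentinel: pass shutdown along
                outbox.put(None)
                return
            outbox.put(fn(item))

    threads = [threading.Thread(target=worker,
                                args=(fn, queues[i], queues[i + 1]))
               for i, fn in enumerate(stage_fns)]
    for t in threads:
        t.start()

    for frame in frames:
        queues[0].put(frame)
    queues[0].put(None)

    results = []
    while (item := queues[-1].get()) is not None:
        results.append(item)
    for t in threads:
        t.join()
    return results
```

Because every stage reduces the data it hands downstream, later stages stay cheap even though the pipeline as a whole keeps up with the sensor's frame rate.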

 

We offer a unique, broad skill set from silicon to software.  We have extensive experience with sensor design, including simulation, layout, modeling, and characterization.  On the image processing and software side our tools include C, Tcl/Tk, Java, Clojure, and now Chapel.  Our preferred development platforms are Linux and Android.  We follow professional, rigorous processes in our projects, and that shows in the quality of what we deliver.

 

We're located in New Hampshire's Monadnock Region in the southwestern part of the state.  The best way to contact us is by e-mail.