Equipping machines with extrasensory perception

I was just browsing and thinking about the official definition of extrasensory perception (ESP). This is also called the “sixth sense,” based on the fact that most people think that we humans are equipped with only five of the little rascals: sight, hearing, touch, taste, and smell.

By the way, we actually have many more senses at our disposal, including thermoception (the sense by which we perceive temperature*), nociception (the sense by which we perceive pain in our skin, joints, and internal organs), proprioception (the sense of the relative position of the parts of one’s body), and equilibrioception (the part of our sense of balance provided by the three tubes of the inner ear). (*Even if you are blindfolded, if you hold your hand near something hot, you can feel the heat in the form of infrared radiation. Conversely, if you hold your hand near something cold, you can detect the lack of heat.) But we digress…

Going back to ESP, the official definition goes something along the lines of “the claimed receipt of information not acquired through the recognized physical senses, but rather sensed with the mind.” Alternatively, Britannica’s website defines ESP as “a perception that occurs independently of the known sensory processes.” Typical examples of ESP include clairvoyance (supernormal awareness of objects or events not necessarily known to others) and precognition (knowledge of the future). In fact, I may have a touch of precognition myself, as I can sometimes see a limited distance into the future. For example, if my wife (Gina the Magnificent) ever discerns the true purpose of my prediction engine, I anticipate that I will no longer need it to predict her current mood.

What about senses that we don’t have but that exist in other animals, such as the fact that some creatures can sense the earth’s magnetic field, while others can perceive natural electrical stimuli? Wouldn’t it be appropriate to classify these abilities as extra-sensory perception in relation to our own sensory experience?

In the case of humans, one of our most powerful and widespread senses is that of sight. As we discussed in a previous column – Are TOM displays the future of consumer AR? – “Eighty to eighty-five percent of our perception, learning, cognition, and activities go through vision” and “More than 50 percent of the cortex, the surface of the brain, is devoted to visual information processing.”

In my Evolution of Color Vision column, I note that the typical human eye has three different types of cone photoreceptors that require bright light, that allow us to perceive different colors, and that provide what is called photopic vision (the sensitivity of our cones peaks at wavelengths of 420, 534, and 564 nm). We also have rod photoreceptors that are extremely sensitive to low levels of light, that cannot distinguish between different colors, and that provide what is known as scotopic vision (the sensitivity of our rods peaks at 498 nm).

Typical humans have three types of cone photoreceptors (think “color”) as well as rod photoreceptors (think “black and white”)
(Image source: Max Maxfield)

There are several things to note about this illustration, including the fact that I’m pretty proud of the little rascal, so please feel free to say nice things about it. Also, observe that I normalized the curves on the vertical axis (i.e., I drew them such that they all have the same maximum height). In reality, rod cells are far more sensitive than cone cells, so much so that attempting to contrast their sensitivities on the same graph would result in a single sharp peak for the rods towering over three slight bumps for the cones. The bottom line is that rods and cones simply don’t play together under the same lighting conditions. Cones require bright light to function, while rods saturate in a bright-light environment. Conversely, in low-light conditions, when the rods come into their own, the cones shut down and provide little or no useful information.
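If you’d like to play with this idea yourself, here’s a minimal Python sketch of what that sort of normalization looks like. Be warned that the Gaussian curve shapes and the 40 nm spread are purely illustrative assumptions on my part (real photoreceptor response curves are asymmetric, and the rods’ absolute sensitivity dwarfs that of the cones); only the peak wavelengths come from the discussion above.

```python
import numpy as np
import matplotlib.pyplot as plt

# Peak sensitivities (nm) mentioned above: S, M, L cones and rods
peaks = {"S cone": 420, "Rod": 498, "M cone": 534, "L cone": 564}

wavelengths = np.linspace(380, 700, 500)  # visible range in nm

for name, peak in peaks.items():
    # Illustrative assumption: model each response as a Gaussian with
    # an arbitrary ~40 nm spread (real curves are not this tidy).
    response = np.exp(-((wavelengths - peak) ** 2) / (2 * 40.0 ** 2))
    # Normalize so every curve tops out at 1.0, as in the illustration;
    # in reality the rods' absolute sensitivity is far higher.
    plt.plot(wavelengths, response / response.max(), label=name)

plt.xlabel("Wavelength (nm)")
plt.ylabel("Normalized sensitivity")
plt.legend()
plt.show()
```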

As an aside, people who don’t believe in evolution often use the human eye as an example of something so sophisticated that it must have been created by an intelligent designer. I think it’s fair to say that I count my eyes among my favorite organs, but you also have to acknowledge that they contain fundamental design flaws that wouldn’t exist if engineers had been in charge of the project.

When it comes to how something as awesome as the eye could have evolved, I heartily recommend two books: Life’s Ratchet: How Molecular Machines Extract Order from Chaos by Peter M. Hoffmann and Wetware: A Computer in Every Living Cell by Dennis Bray. Also, in this video, the British evolutionary biologist Richard Dawkins gives a good overview of a possible evolutionary development path for eyes in general.

I don’t know about you, but I’m extremely impressed with the wealth of information my eyes provide. From a certain point of view (no pun intended), I’m riding the crest of the wave that represents the current peak of human evolution. So are you, of course, but this is my column, so let’s stay focused (once again, no pun intended) on me. On the other hand, I have to admit I was a little chagrined when I first discovered that the humble mantis shrimp (which isn’t really a shrimp or a mantis) boasts the most complex eyes known in the animal kingdom. As I noted in my earlier Evolution of Color Vision column: “The intricate details of their visual systems (three different regions in each eye, independent movement of each eye, trinocular vision, etc.) are too numerous and varied to discuss here. Suffice it to say that scientists have discovered certain species of mantis shrimp with sixteen different types of photoreceptors: eight for light in (what we consider to be) the visible part of the spectrum, four for ultraviolet light, and four for the analysis of polarized light.” In fact, it’s said that in the ultraviolet alone, these little rascals have the same capability that humans have in normal light.

What? Sixteen different types of photoreceptors? Are you seriously telling me that this doesn’t count as extrasensory perception in any way?

The reason I’m talking about all of this here (yes, of course there’s a reason) is that I was just chatting with Ralf Muenster, who is vice president of business development and marketing at SiLC Technologies. The tagline on SiLC’s website reads, “Helping machines see like humans.” As we will see, however, in some respects they provide sensing capabilities that go beyond those offered by traditional vision sensors, thereby providing the machine equivalent of extrasensory perception (Ha! Unlike you, I was sure that – just like one of my mum’s torturous tales – I’d end up bringing this point home).

Ralf pointed out that today’s AI-based perception systems for machines like robots and automobiles mostly feature cameras (sometimes single camera modules, sometimes two-module setups to provide binocular vision). Ralf also noted that, much like the human eye, traditional camera vision techniques look only at the intensity of the photons that hit their sensors.

These vision systems, which are great for things like detecting and recognizing road markings and signs, are usually supplemented with some form of lidar and/or radar capability. The interesting thing for me was when Ralf explained that the lidar systems currently used in automotive and robotics applications are primarily based on a time-of-flight (TOF) approach, in which they generate powerful pulses of light and measure the round-trip time of any reflections. These systems use large amounts of power – say 120 watts – and the only reason they are considered “eye safe” is that the pulses they generate are so short. Also, when they see a pulse of reflected light, they don’t know if it’s their own pulse or someone else’s. And, just to add to the fun and frivolity, since light travels about one foot per nanosecond, every nanosecond spent waiting can equate to being a foot closer to a potential problem.
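To put that “one foot per nanosecond” rule of thumb into numbers, here’s a back-of-the-envelope Python sketch of the basic TOF arithmetic. The 120 ns round-trip value is just an illustrative example, not a figure from Ralf or SiLC.

```python
# Back-of-the-envelope TOF lidar arithmetic (illustrative values only)
C = 299_792_458.0          # speed of light in m/s
FT_PER_M = 3.28084         # feet per meter

def tof_range_m(round_trip_ns: float) -> float:
    """Distance to the target given the measured round-trip time of a pulse."""
    round_trip_s = round_trip_ns * 1e-9
    # The pulse travels out and back, so divide the total path by two.
    return C * round_trip_s / 2.0

# Example: a 120 ns round trip corresponds to a target ~18 m (~59 ft) away.
rt_ns = 120.0
print(f"{rt_ns} ns round trip -> {tof_range_m(rt_ns):.1f} m "
      f"({tof_range_m(rt_ns) * FT_PER_M:.1f} ft)")

# Rule of thumb: light covers ~1 ft per ns, so each extra nanosecond of
# round-trip time corresponds to roughly half a foot of extra range.
```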

That’s when Ralf proudly pulled out his Eyeonic vision sensor, so to speak. The Eyeonic features frequency-modulated continuous-wave (FMCW) lidar. In fact, Ralf claims it’s the most compact FMCW lidar ever made, and that it’s several generations ahead of any competitor in the level of optical integration it deploys.

The Eyeonic vision sensor (Image source: SiLC)

But what exactly is an FMCW lidar when it’s at home? Well, first of all, the Eyeonic sends out a continuous laser beam at a much lower intensity than its pulsed TOF cousins. It then uses a local oscillator to mix any reflected light with the light generated by its coherent laser emitter. Using some highly intelligent digital signal processing (DSP), it can instantaneously extract depth, velocity, and polarization-dependent intensity, while remaining completely immune to environmental and multi-user interference. Additionally, the combination of polarization and wavelength information facilitates surface analysis and material identification.
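As a way of wrapping one’s head around this, here’s a hedged Python sketch of the textbook triangular-chirp FMCW math, in which range falls out of the average of the up-chirp and down-chirp beat frequencies and radial velocity falls out of their difference via the Doppler shift. All of the numbers (a 1550 nm laser, the chirp bandwidth, the chirp time) are my own illustrative assumptions and say nothing about SiLC’s actual implementation.

```python
# Illustrative FMCW lidar range/velocity recovery (triangular chirp).
# Parameter values are assumptions for this sketch, not SiLC's figures.
C = 299_792_458.0        # speed of light, m/s
WAVELENGTH = 1.55e-6     # assumed 1550 nm laser
BANDWIDTH = 4e9          # assumed chirp bandwidth, Hz
CHIRP_TIME = 10e-6       # assumed chirp duration, s
SLOPE = BANDWIDTH / CHIRP_TIME  # chirp slope, Hz/s

def range_and_velocity(f_beat_up: float, f_beat_down: float):
    """Recover range (m) and radial velocity (m/s, positive = approaching)
    from the beat frequencies measured on the up and down chirps."""
    f_range = (f_beat_up + f_beat_down) / 2.0    # range-induced beat
    f_doppler = (f_beat_down - f_beat_up) / 2.0  # Doppler-induced shift
    distance = C * f_range / (2.0 * SLOPE)
    velocity = WAVELENGTH * f_doppler / 2.0
    return distance, velocity

# Example: forward-simulate a target at 50 m closing at 20 m/s,
# then recover it from the resulting beat frequencies.
target_range, target_velocity = 50.0, 20.0
f_r = 2.0 * SLOPE * target_range / C
f_d = 2.0 * target_velocity / WAVELENGTH
print(range_and_velocity(f_r - f_d, f_r + f_d))  # -> (50.0, 20.0)
```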

Again, the material identification aspect of this led me to consider creating bots that could target people wearing 1970s-style polyester golf pants and teach them the error of their ways, but then I decided that my dear old mother would be appalled by that train of thought, so I quickly de-thought it.

Eyeonic vision sensor specifications (Image source: SiLC)

It’s difficult to overstate the importance of the velocity data provided by the Eyeonic vision sensor, because the ability to detect motion and speed is essential when it comes to performing visual cognition. By providing direct, instantaneous motion/velocity information on a per-pixel basis, the Eyeonic vision sensor facilitates the rapid detection and tracking of objects of interest.
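To see why per-pixel velocity is such a big deal, consider the toy Python sketch below: once every pixel carries a velocity value, separating moving objects from the static background becomes a simple thresholding-and-labeling exercise rather than a matter of comparing successive frames. The array shapes, the made-up velocity values, and the 0.5 m/s threshold are all illustrative assumptions on my part.

```python
import numpy as np
from scipy import ndimage

# Toy example: a 100x100 "velocity image" with a radial velocity (m/s)
# per pixel, as an FMCW sensor could supply directly. Values are made up.
velocity = np.zeros((100, 100))
velocity[20:30, 40:55] = -3.0   # something approaching the sensor
velocity[70:80, 10:20] = 1.5    # something receding

# Anything moving faster than 0.5 m/s (illustrative threshold) is "of interest".
moving_mask = np.abs(velocity) > 0.5

# Group contiguous moving pixels into candidate objects in a single pass --
# no frame-to-frame differencing required.
labels, num_objects = ndimage.label(moving_mask)
for obj_id in range(1, num_objects + 1):
    ys, xs = np.where(labels == obj_id)
    mean_v = velocity[ys, xs].mean()
    print(f"Object {obj_id}: {len(ys)} px, mean radial velocity {mean_v:+.1f} m/s")
```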

Of course, the FMCW-based Eyeonic vision sensor isn’t intended to replace traditional camera-based vision sensors, which will continue to evolve in terms of resolution, nor the artificial intelligence systems that employ them. On the other hand, I think traditional TOF lidar-based systems should start looking for the “exit” door. I also believe that combining camera-based vision systems with FMCW-based vision systems will open the doors to new capabilities and possibilities. I can’t wait! What say you?
