Silicon Chip: Techno Talk - That makes so much sense! (August 2024)

This is only a preview of the August 2024 issue of Practical Electronics.


That makes so much sense!

Techno Talk, by Max the Magnificent

I’m always amazed by the cunning creations of the pioneers of yesteryear, especially when I consider the rudimentary sensors they had at their disposal. I often wonder what their reactions would be to see the sophisticated sensors we have available to us today.

As usual, my poor old noggin is full of random thoughts bouncing around like super balls on steroids. The topic that has currently captured my attention is that of sensors. What pops into your mind when you hear (or see) the word ‘sensor’? Living in an age of wonders as we do, you may be thinking of highfalutin’ devices like lidar (light detection and ranging) or radar (radio detection and ranging). When we boil things down, however, a sensor is any device that detects some physical phenomenon and produces a corresponding output signal. For our purposes here, we will assume electrical output signals in terms of voltage or current being used to feed electrical or electronic systems, but this isn’t cast in stone.

Victorian fax machines

Can you imagine the Victorians sending faxes to each other? This may seem far-fetched, but in 1842, a Scottish engineer and inventor called Alexander Bain came up with a cunning idea. He created an image to be transmitted by cutting it out of a thin sheet of tin. He placed this metal image on a movable insulated base and connected it to one side of a battery. The base was slowly passed under a swinging pendulum formed from a conducting wire with a weighted point on the end. Whenever this point connected with the metal image, it completed the electrical circuit, thereby converting the dark and light areas of the image – which were represented by the presence or absence of tin – into an electrical signal. This cunning creator used this electrical signal to activate a relay attached to the end of another pendulum that was swinging back and forth over a second moving bed.
The activated relay caused an attached pencil to encounter a piece of paper lying on the moving bed, thereby reproducing the original image in metal as a drawing in pencil.

Ray guns and TV controllers

Did you ever see the original Buck Rogers science fiction serial from 1939 starring Buster Crabbe? Filmed in glorious black-and-white, this was originally released as a series of 20-minute movies, and later shown in the 1950s and 1960s on TV. Buck sported a ray gun in the form of a ‘U-235 Atomic Pistol’. The reason I mention all this is that the first practical photoelectric cells were invented in the 1880s. In 1955, Zenith introduced the world’s first wireless television remote control, called the Flash-Matic. Looking like something Buck Rogers would not be ashamed to be seen carrying, this glorified torch (flashlight) employed a beam of light to activate four photocells located at the corners of the screen, thereby allowing the user to control the volume and channel selection.

Mobile sensor platforms

Do you remember the artificial intelligence (AI) called KITT (Knight Industries Two Thousand) powering the highly advanced, very mobile, robotic automobile in the Knight Rider TV series of the 1980s? Today’s cars are getting close to (and sometimes surpass) KITT’s capabilities. My own 2019 Subaru Crosstrek is equipped with binocular vision that can be used to detect and correct any drifting out of lane, vary the speed of the cruise control if we get too close to a car in front, and slam on the brakes if it feels we are in danger of imminent collision. All of this is made even more exciting by my wife screaming in my ear. In fact, today’s autonomous cars and robots are essentially mobile sensor and computing platforms. A very common scenario is to have multiple cameras equipped with CMOS sensor arrays that are sensitive to light in the visible part of the spectrum.
These feed advanced processors running AI algorithms that can perform tasks like object detection and recognition. These cameras can be augmented by lidar and radar sensors. The original lidars were big, bulky and expensive, but new versions are coming online in which almost everything is implemented in semiconductor form. As opposed to a simple time-of-flight (TOF) approach, which involves generating powerful pulses of light and measuring the round-trip time of any reflections, companies like SiLC Technologies are using a frequency-modulated continuous-wave (FMCW) approach that can provide distance and velocity data on a pixel-by-pixel basis, allowing them to perceive and identify objects more than a kilometre away. Meanwhile, companies like Owl Autonomous Imaging are creating longwave infrared (LWIR) thermal focal plane arrays (thermal imagers). The signals from these imagers can be employed by AI to perform object detection, classification and ranging. As the folks at Owl told me, ‘Within five years, all new cars will be able to see at night!’

That’s deep!

Have you ever thought about our amazing ability to perceive the world around us in three dimensions? Powered by our optical sensors (eyes) and associated computers (brains), we call this ability ‘depth perception’. There are many aspects to this, but we start with the fact that each of our eyes sees a slightly different image due to their separation in our heads. The resulting disparities are processed in the visual cortex of our brains to yield depth information. Even with one eye closed, we can still do things like track and catch a ball heading our way. In this case, our brains make use of visual cues, including knowing how big we expect objects to be, and our understanding that if an object appears to be growing bigger, then this may be a good time to duck. In the case of machine vision, one of the components of depth perception is the ability to create a 3D depth map (point cloud) of the scene.
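The arithmetic behind binocular depth is pleasingly simple and can be sketched in a few lines of Python. This is a minimal illustration, not production code: the focal length and baseline below are made-up values for a hypothetical rectified stereo rig.

```python
# Depth from stereo disparity: a minimal sketch with made-up numbers.
# For a rectified stereo pair, a point at depth Z appears shifted
# horizontally between the two images by a disparity d = f * B / Z,
# where f is the focal length in pixels and B is the baseline (the
# separation between the two sensors). Inverting gives Z = f * B / d.

def depth_from_disparity(disparity_px: float,
                         focal_px: float = 800.0,   # hypothetical focal length (pixels)
                         baseline_m: float = 0.12   # hypothetical baseline (metres)
                         ) -> float:
    """Return the estimated depth in metres for one pixel's disparity."""
    if disparity_px <= 0:
        raise ValueError("zero disparity would place the point at infinity")
    return focal_px * baseline_m / disparity_px

# A toy row of disparity values turned into depths, point-cloud style:
disparities = [48.0, 24.0, 12.0, 6.0]
depths = [depth_from_disparity(d) for d in disparities]
```

Note that depth is inversely proportional to disparity: halving the disparity doubles the estimated depth, which is one reason stereo range estimates degrade rapidly for distant objects.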
We can do this using two CMOS sensors to provide binocular vision, but that increases the cost. Alternatively, we can employ a single CMOS sensor in conjunction with an AI, using its understanding of the scene to determine where and how big things are in 3D space, but this requires a lot of computation. The folks at a company called AIRY3D have come up with a way to use a single CMOS sensor to generate both a regular 2D image and a 3D point cloud on a pixel-by-pixel basis with very little computation. I don’t know about you, but I certainly didn’t see this one coming!
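As a closing aside, the two lidar ranging principles mentioned above – pulsed time-of-flight and FMCW – both reduce to a little arithmetic on the speed of light. Here is a minimal Python sketch; the chirp bandwidth and duration are made-up values, and a real FMCW lidar would also recover velocity from the Doppler shift, which is omitted here.

```python
# Two lidar ranging principles, sketched with illustrative numbers.

C = 299_792_458.0  # speed of light in a vacuum, m/s

# 1. Pulsed time-of-flight (TOF): fire a light pulse and time the echo.
#    The pulse travels out and back, so distance = c * t / 2.
def tof_distance_m(round_trip_s: float) -> float:
    return C * round_trip_s / 2.0

# 2. FMCW: transmit a linear frequency chirp. The reflection is a
#    delayed copy of the transmitted signal, so mixing the two yields a
#    beat frequency f_b = slope * delay, where slope is the chirp rate
#    (bandwidth / chirp duration). Range follows from the recovered delay.
def fmcw_distance_m(beat_hz: float,
                    bandwidth_hz: float = 1.0e9,  # hypothetical chirp bandwidth
                    chirp_s: float = 100.0e-6     # hypothetical chirp duration
                    ) -> float:
    slope_hz_per_s = bandwidth_hz / chirp_s
    round_trip_s = beat_hz / slope_hz_per_s
    return C * round_trip_s / 2.0
```

With these made-up chirp parameters, a target 150m away produces a beat frequency of roughly 10MHz, which is comfortably measurable with ordinary signal-processing hardware.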