We've made good progress on the ranging LIDAR prototype over the last week, and here's a video of it installed on a modified Companion Bike Seat. The LIDAR sensor sits on a servo that scans back and forth to expand its range of "visibility." It's not incredibly clear from the video (LEDs and video don't play well together), but this bicycle radar prototype is delivering information back to the LED board from eight discrete positions. Each vertical line on the LED board represents one of those sensor positions, and the number of LEDs lit in that line shows how far away an object is: the closer the object, the more LEDs light up. The result is a two-dimensional display of what's in view of the sensor, built from those eight discrete positions.
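As a rough sketch of that distance-to-column mapping, here's one way it might work. This is illustrative only, not the actual Velalert firmware; the range and column-height constants are assumptions:

```python
# Illustrative sketch (not the actual Velalert code): mapping one LIDAR
# distance reading to the number of LEDs lit in its vertical line.

MAX_RANGE_CM = 4000   # assumed maximum useful sensor range
COLUMN_HEIGHT = 8     # assumed number of LEDs per vertical line

def leds_for_distance(distance_cm):
    """Closer objects light more LEDs; out-of-range objects light none."""
    if distance_cm >= MAX_RANGE_CM:
        return 0
    # Linear inverse mapping: 0 cm -> full column, MAX_RANGE_CM -> empty.
    fraction = 1.0 - (distance_cm / MAX_RANGE_CM)
    return max(1, round(fraction * COLUMN_HEIGHT))
```

With these assumed constants, an object at half the maximum range would light half the column, and anything inside range lights at least one LED so it never silently disappears.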
I'm bringing my bike down to Radicand tomorrow afternoon to have them install this latest prototype of Velalert onto it. If all goes as planned, I'll be testing out this latest proof-of-concept bicycle radar next weekend!
The sensor (on the servo) is shown in the photo above. The first video below shows the LED board lighting up as the car approaches, with the sensor stationary. The second video shows the servo in action, with the LED board showing three distinct data sets, one from each position of the sensor.
Next, we're going to install this latest proof-of-concept into a Companion Bike Seat (which ends up being perfect for this prototype) and take it out for some road-testing!
Based on the results with the first prototype, we decided to shift focus to the LIDAR sensor. It has a much longer range and delivers far more accurate readings than the ultrasonic sensor. The challenge is that its beam is very narrow, and I want to give the rider a broader range of visibility than a single point directly behind. Rather than adding more expensive sensors to the system, we aim to solve this problem using only one LIDAR sensor. This is a video of the first explorations into broadening the sensor's visibility, in this case by mounting the LIDAR on a servo and pointing it in three different directions.
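The control flow of that three-position sweep can be sketched roughly as follows. The real prototype runs on a microcontroller with actual servo and LIDAR drivers; here both are stubbed out so only the structure is shown, and the angles and settle time are assumptions:

```python
import time

# Hypothetical sketch of a three-position sweep. set_servo_angle() and
# read_lidar_cm() are stand-ins for real hardware drivers; the angles and
# dwell time below are assumed values, not the prototype's actual settings.

SWEEP_ANGLES_DEG = [60, 90, 120]  # assumed left / center / right positions
DWELL_S = 0.05                    # assumed settle time after each move

def set_servo_angle(angle_deg):
    pass  # stand-in for a real servo driver

def read_lidar_cm():
    return 1500  # stand-in for a real LIDAR reading

def sweep_once():
    """Point the sensor at each angle in turn and take one reading per angle."""
    readings = []
    for angle in SWEEP_ANGLES_DEG:
        set_servo_angle(angle)
        time.sleep(DWELL_S)  # let the servo settle before reading
        readings.append((angle, read_lidar_cm()))
    return readings
```

Each pass over the angles produces one (angle, distance) pair per position, which is exactly the per-position data the LED board needs.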
Remember Johnny Five from the movie Short Circuit? The resemblance is striking, I'd say ...
The challenge here is turning the data received from the sensor into a two-dimensional picture, based on the known timing of the servo's shifts in position. Once we can do that successfully with this three-angle version, we can extrapolate to however many positions the sensor needs for a complete sweep across a broad enough angle to give Velalert adequate visibility and very clean data.
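One hedged sketch of that timing idea: if the servo holds each of N positions for a known, fixed dwell time, then a reading's timestamp within the sweep tells you which column of the picture it belongs to. All the constants here are assumptions for illustration, not measured values from the prototype:

```python
# Hypothetical sketch: binning timed LIDAR readings into columns of a 2D
# picture using the known servo dwell time. Constants are assumed values.

NUM_POSITIONS = 3   # three angles in this early version
DWELL_S = 0.2       # assumed time the servo holds each position
COLUMN_HEIGHT = 8   # assumed LEDs per vertical line
MAX_RANGE_CM = 4000 # assumed maximum useful sensor range

def column_for_timestamp(t_s):
    """Map a timestamp within one sweep to its servo-position column."""
    return min(int(t_s // DWELL_S), NUM_POSITIONS - 1)

def build_picture(timed_readings):
    """Turn (timestamp, distance) pairs into per-column lit-LED counts."""
    picture = [0] * NUM_POSITIONS
    for t_s, dist_cm in timed_readings:
        col = column_for_timestamp(t_s)
        fraction = max(0.0, 1.0 - dist_cm / MAX_RANGE_CM)
        picture[col] = round(fraction * COLUMN_HEIGHT)  # closer -> more LEDs
    return picture
```

Scaling this from three positions to a full sweep is then just a matter of raising NUM_POSITIONS, provided the timing between the servo and the sensor reads stays in sync.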