Based on the results with the first prototype, we decided to shift focus to the LIDAR sensor. It has a much longer range and gives far more accurate readings than the ultrasonic sensor. The challenge is that its beam is very narrow, and I want to give the rider a broader field of visibility than a single point directly behind. Rather than adding more expensive sensors to the system, we aim to solve this with just one LIDAR sensor. This video shows the first explorations into widening the sensor's field of view, in this case by mounting the LIDAR on a servo that points it in three different directions.
Remember Johnny Five from the movie Short Circuit? The resemblance is striking I'd say ...
The challenge here is turning the data received from the sensor into a two-dimensional picture, using the known timing of the servo's shifts in position to match each reading to an angle. Once we can do that reliably with this three-angle version, we can extrapolate to however many positions the sensor needs for a complete sweep across a wide enough angle to give adequate visibility and very clean data for Velalert.
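To make that idea concrete, here is a rough Arduino-style sketch of the loop under discussion: step the servo through three fixed positions, wait a known settle time so the angle and the reading stay in sync, and convert each (angle, distance) pair into an x/y point. The pin number, the three angles, the settle time, and the readLidarDistanceCm() helper are all placeholders for whatever the actual hardware uses, not the final Velalert code.

```cpp
#include <Servo.h>

// Placeholder: swap in the read call for the specific LIDAR module in use.
long readLidarDistanceCm() {
  return 250;  // dummy value so the sketch compiles and runs standalone
}

Servo sweepServo;

// Three fixed servo positions (degrees), mirroring the three-direction prototype.
const int ANGLES[] = {60, 90, 120};
const int NUM_ANGLES = sizeof(ANGLES) / sizeof(ANGLES[0]);
const unsigned long SETTLE_MS = 300;  // time allowed for the servo to reach and settle at each position

void setup() {
  Serial.begin(9600);
  sweepServo.attach(9);  // servo signal pin (assumed wiring)
}

void loop() {
  for (int i = 0; i < NUM_ANGLES; i++) {
    sweepServo.write(ANGLES[i]);  // command the next position
    delay(SETTLE_MS);             // fixed, known delay keeps angle and reading paired up

    long d = readLidarDistanceCm();        // single-point range along the current beam direction
    float theta = radians(ANGLES[i]);      // degrees -> radians

    // Polar (angle, distance) -> 2D Cartesian point in the plane behind the bike
    float x = d * cos(theta);
    float y = d * sin(theta);

    Serial.print("angle=");
    Serial.print(ANGLES[i]);
    Serial.print(" x=");
    Serial.print(x);
    Serial.print(" y=");
    Serial.println(y);
  }
}
```

Extending this to a full sweep is mostly a matter of replacing the three hard-coded angles with a finer step size, at the cost of a longer time to cover the whole arc.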