Comments on an LDROBOT LD-Air Lidar

I purchased an LD-AIR LIDAR from LDROBOT as part of a Kickstarter campaign. I received it some time ago but did not have time to play with it until this week. These are my notes.

The unit looks and feels nice. It is not a chintzy unit; the molds for the parts all look very well made. The layout seems good and the size is great. It is not much bigger than a golf ball.

When I purchased it, I did not buy the USB-to-serial adapter. By this point I must have a dozen. What I failed to realize was that the adapter also comes with the cable that plugs into the unit. So job one was to track down the connector and make or buy a cable.

What is lacking is documentation. There is a datasheet with things like scan rate, range, accuracy, wavelength, pinout, and such, but nothing on the connector. I posted to their forum and got a reply with the type of cable. An Amazon purchase got me these JST ZH 1.5MM 4 Pin Cables.

So after a few minutes of soldering, I had a cable to power up the unit. The lack of documentation meant I had no idea of the data format or the baud rate. I got the baud rate (230400) from an oscilloscope. I spent some time trying to reverse engineer the format, with little luck. My first break came when I looked at the documentation for the company's other LIDAR units; it would appear that they use the same format. The data comes in packets of 12 measurements: each packet has a header, followed by an array of measurements, followed by a footer. The C structures I used to decode it are:

#pragma pack(push, 1)
typedef struct              // 6 bytes
{
  uint8_t  som;             // Start of message (0x54)
  uint8_t  size;            // Low 5 bits is the number of measurements (12)
  uint16_t speed;           // Rotation speed
  uint16_t start;           // Start angle in units of 0.01 degrees
} Ld19Hdr_t;

typedef struct              // 3 bytes * 12 = 36
{
  uint16_t dist;            // Distance in mm
  uint8_t  confidence;      // Confidence value. Higher is better
} Ld19Meas_t;

typedef struct              // 5 bytes
{
  uint16_t end;             // End angle in units of 0.01 degrees
  uint16_t time;            // Time stamp in msec
  uint8_t  crc;             // CRC of bytes from SOM to time
} Ld19Footer_t;

typedef union
{
  struct
  {
    Ld19Hdr_t    hdr;       // Header
    Ld19Meas_t   body[12];  // Body
    Ld19Footer_t footer;    // Footer
  } msg;
  uint8_t buf[47];          // Array of bytes (6 + 36 + 5 = 47)
} Ld19Msg_t;
#pragma pack(pop)
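Putting the structures to use: the packet only carries the start and end angles, so the angle of each of the 12 measurements has to be interpolated between them, with a little modular arithmetic since the end angle can be numerically smaller than the start when the sweep wraps past 360 degrees. A sketch of that step (the even spacing of measurements within a packet is my assumption; the types are condensed here so the fragment stands alone):

```c
#include <stdint.h>

#pragma pack(push, 1)
typedef struct { uint8_t som, size; uint16_t speed, start; } Ld19Hdr_t;
typedef struct { uint16_t dist; uint8_t confidence; } Ld19Meas_t;
typedef struct { uint16_t end, time; uint8_t crc; } Ld19Footer_t;
typedef union
{
  struct { Ld19Hdr_t hdr; Ld19Meas_t body[12]; Ld19Footer_t footer; } msg;
  uint8_t buf[47];
} Ld19Msg_t;
#pragma pack(pop)

/* Angle (in 0.01-degree units) of measurement i in a packet.
 * The 12 measurements are assumed evenly spaced over the 11
 * intervals between the start and end angles.  The "+ 36000"
 * keeps the subtraction positive when the sweep wraps past 360. */
static uint32_t meas_angle(const Ld19Msg_t *m, int i)
{
  uint32_t start = m->msg.hdr.start;
  uint32_t end   = m->msg.footer.end;
  uint32_t span  = (end + 36000 - start) % 36000;
  return (start + span * (uint32_t)i / 11) % 36000;
}
```

For example, a packet with start 359.00 degrees and end 1.00 degrees spans 2.00 degrees, and measurement 0 lands at 359.00 while measurement 11 lands at 1.00.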

 

I then wrote a simple Python program to record the data to a hex file, and another to decode the data and generate a CSV file as well as a series of images, one for each 360-degree sweep of the LIDAR. One small wrinkle is that a full sweep does not take an integral number of packets: the angle can wrap from 359+ degrees back to 0 in the middle of a packet of 12 measurements.

I then combined the series of images into a movie using ffmpeg. Animating the sweeps makes the large amount of collected data much easier to take in.

The next step was to do the data collection using a microcontroller. My current unit of choice is the Teensy 4.1: it is ridiculously fast and has a built-in microSD card slot for data storage. The downside is that it has neither WiFi nor Bluetooth transceivers.

I wired up the LIDAR to the Teensy as well as an IR optical reflection sensor. A 3D-printed part held the LIDAR, the Teensy, the IR sensor, and a small power bank for power. This was all mounted on top of a LEGO train car that ran on about 8 feet of straight train track. The IR sensor was mounted so as to see the railroad ties, with the contrast enhanced by a small white 1x1 tile on each tie. Now as the train moved, I could also record the time each railroad tie passed underneath. Knowing that the ties are exactly 32 mm apart allowed me to know exactly where the LIDAR unit was along the motion path.
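That last step can be sketched as a simple interpolation: assuming the car moves at a roughly constant speed between ties, the position at any LIDAR timestamp falls linearly between the two surrounding tie-crossing times. The function name and millisecond timestamps are my own choices for illustration:

```c
#include <stdint.h>

#define TIE_SPACING_MM 32.0   /* spacing between LEGO railroad ties */

/* Given the sorted times (msec) at which n consecutive ties passed
 * under the IR sensor, estimate the LIDAR's position (mm) along the
 * track at time t by linear interpolation between the two surrounding
 * tie crossings.  Times outside the recorded range clamp to the ends. */
static double position_at(const uint32_t *tie_times, int n, uint32_t t)
{
  if (n < 2 || t <= tie_times[0]) return 0.0;
  if (t >= tie_times[n - 1]) return (n - 1) * TIE_SPACING_MM;
  for (int i = 1; i < n; i++) {
    if (t <= tie_times[i]) {
      double frac = (double)(t - tie_times[i - 1]) /
                    (double)(tie_times[i] - tie_times[i - 1]);
      return ((i - 1) + frac) * TIE_SPACING_MM;
    }
  }
  return (n - 1) * TIE_SPACING_MM;  /* not reached */
}
```

With tie crossings at 0, 100, and 200 ms, a measurement stamped 150 ms puts the unit 48 mm down the track (halfway between the second and third tie).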

My next steps are to continue to understand the limitations of the unit and to explore SLAM.

 
