IAmMuse: A signal-processing-based method to estimate arm positions with mmWave radar

This project contains the code used in my Master's thesis on interpreting sparse mmWave radar point clouds for use in human-computer interaction, demonstrated through a musical application.

For this project we used the IWR6843ISK mmWave radar by Texas Instruments.

In the thesis, we explored the possibility of non-deep-learning interpretation algorithms for millimeter-wave radar point clouds. We did this through a benchmark application which classifies each of the user's arms as either low, middle, or high. Combining these 3 positions per arm, for a total of (3 × 3 =) 9 possible position states, we emit MIDI notes in accordance with where the user's arms are.
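As a hypothetical illustration of that 3 × 3 mapping (the enum, function name, and note numbers below are invented for the example, not the project's actual values):

```rust
// Hypothetical sketch of mapping the 9 arm-position states to MIDI notes.
// The zone names match the thesis; the note numbers are illustrative only.
#[derive(Clone, Copy, PartialEq, Debug)]
enum ArmZone {
    Low,
    Middle,
    High,
}

// Map a (left, right) arm-zone pair onto one of 9 MIDI note numbers,
// using middle C (60) as an arbitrary base for the example.
fn state_to_midi_note(left: ArmZone, right: ArmZone) -> u8 {
    let idx = |z: ArmZone| match z {
        ArmZone::Low => 0u8,
        ArmZone::Middle => 1,
        ArmZone::High => 2,
    };
    60 + idx(left) * 3 + idx(right) // 9 states -> notes 60..=68
}
```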

The whole pipeline consists of an interface with the mmWave radar (implementing the TLV protocol), filtering out noise, interpreting the data, and sending the results on to a visualizer (implemented in Python) and a musical note generator.

If you plan to explore the code, please first look through the main.rs file to get an idea of the various parts which make up the system. Each part runs as a separate tokio task, and they communicate through message channels and broadcasts.
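A dependency-free sketch of that channel-based layout (the real code uses tokio tasks and channels; plain threads and std::sync::mpsc stand in for them here, and the stage roles are only illustrative):

```rust
use std::sync::mpsc;
use std::thread;

// Simplified illustration of the pipeline's channel-based architecture:
// three stages connected by channels, each running concurrently.
fn run_pipeline() -> usize {
    let (raw_tx, raw_rx) = mpsc::channel::<Vec<u8>>();
    let (frame_tx, frame_rx) = mpsc::channel::<usize>();

    // Stage 1: stand-in for the radar reader, emitting raw byte chunks.
    thread::spawn(move || {
        for chunk in [vec![1u8, 2, 3], vec![4u8, 5]] {
            raw_tx.send(chunk).unwrap();
        }
        // raw_tx drops here, closing the channel for stage 2.
    });

    // Stage 2: stand-in for the TLV parser, turning bytes into "frames".
    thread::spawn(move || {
        for chunk in raw_rx {
            frame_tx.send(chunk.len()).unwrap();
        }
    });

    // Stage 3: the consumer side (interpreter / visualizer feed).
    frame_rx.iter().sum()
}
```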

PREREQUISITES

In order to communicate with the FMCW radar, there are some things you have to set up first.

Communicating with the device

On Linux

You need to be added to the dialout group; how to do this differs slightly depending on the distribution you run, so look it up for yours. You also need to make the device nodes of the FMCW radar accessible to users (most likely /dev/ttyUSB0 and /dev/ttyUSB1) with sudo chmod 666 /dev/ttyUSB0 && sudo chmod 666 /dev/ttyUSB1

On Windows

If I remember correctly, you have to download and install the Texas Instruments driver; this should make the mmWave radar show up among your devices as a COM device (or two devices: one for sending data to the radar and one for receiving data from it). From there you should be able to communicate with the device.

On Mac

I would not know; I hope Texas Instruments has decent support and instructions for you.

Using the right config

Note that the file settings.toml refers to the ports used for data and configuration; these differ between Linux and Windows (and possibly macOS). Please make sure you use the appropriate ones.
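Purely as an illustration (the key names below are invented; check the real settings.toml for the actual ones), the platform difference might look like:

```toml
# Hypothetical sketch of the port settings; key names are made up here.
# Linux typically exposes the radar as two ttyUSB devices:
config_port = "/dev/ttyUSB0"
data_port   = "/dev/ttyUSB1"
# On Windows these would instead be COM ports, e.g. "COM3" and "COM4".
```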

Firmware

It's important to have the right firmware installed. I used the Mobile Tracking firmware, which can be found at radar_toolbox_3_00_00_05/source/ti/examples/Industrial_and_Personal_Electronics/Robotics/Mobile_Tracker/prebuilt_binaries/mobile_tracking

Running the code

The Rust side

The code is run as one would run any Rust project, through cargo run --release; note that the --release flag is very important for good performance. You can also build an executable through cargo build --release, which can then be found in the target folder.

If you want to replay a specific recording you made previously, you can do so by appending the directory name to the run command: cargo run --release $RECORDING_DIR. To replay a recording containing only mmWave data, simply refer to that folder. To replay a recording which also contains Kinect ground-truth data, refer to the top-level directory (the one containing both an mmwave folder and a kinect.csv file).

Note that the code expects to be run from the project root. If you try to run the code from anywhere but the project root, you will be warned that this will not work.

By default the recording will be saved into a date-stamped sub-folder in frame_recordings.

The mmWave configuration file can be changed by modifying the CONFIG_FILE variable in src/main.rs, e.g. static CONFIG_FILE: &str = "./my_new_config_file.cfg"; Note that the . in this case refers to the project root.

The Python side

The folder python_part contains the code used for visualization. The easiest way to run this code is through the ./visualizer.fish script, although this does require the fish shell to be installed. Translating the script to bash should be relatively easy; the main parts consist of activating the virtual environment and running the code.

NOTE: If you start the Rust code before the visualizer has printed "Python server listening...", the two won't be connected properly and you won't get a working visualizer.

Making sound

Windows

On Windows, I found that some system in Windows captures the MIDI messages sent out into the aether and generates sounds from them. This system repeatedly "hits" the note, as opposed to sustaining it, which is somewhat annoying, but it seems to work out of the box.

Linux

On Linux I used the fundsp audio synthesizer project. Specifically, I compiled the live_adsr example and ran it (through the minimal ./adsr.sh script).

Please take care to compile the fundsp code with the --release flag; otherwise it may not perform well.

Collecting ground truth

To record the mmWave data and collect the ground truth, you need to run the code in the following git repo. To generate the combined recordings:

  • Record the mmWave radar data through the IAmMuse system as usual, letting the code save the recording into frame_recordings.
  • Record the Kinect (ground truth) data in accordance with that project, and save the data to the folder Kinect recordings/.
  • Once both recordings are saved, run the kinect_mmwave_mover.sh command, passing the name of the recording, e.g. ./kinect_mmwave_mover.sh my_beautiful_recording_name.
  • Finally, the combined recording can be found in user_recordings/.

Note: The system has a few large quirks. First, the Kinect code can only run on Windows. Second, we use a bash script to move the data; we therefore used WSL to achieve this.

Project structure

All filenames here are relative to the root of src. Also please note that the code often uses the terms initialization or initializer where the thesis itself uses the term calibration.

  • main.rs is the entry point of the system: it reads user input and config files, sets up communication channels, and spawns the various processes which make up the program. It may spawn different versions of a process, depending on flags which are set earlier in the main file.
  • file_reader.rs reads the settings.toml and the fmcw_config.cfg files.
  • constants.rs is a centralised file which holds several tunable system variables in the CONSTANTS variable. This variable is imported and referred to in various parts of the system.
  • tracing_setup.rs contains the code which dictates what severity of messages (debug, info, warn, error) will be printed to standard out for each part of the program, using the tracing library.
  • types.rs holds many types which see shared use across various systems.
  • fmcw_manager.rs is a wrapper around the raw serial device of the mmWave radar (FMCW), to facilitate interfacing with it programmatically; the process passes all raw binary data received from the FMCW into a channel.
  • tlv_translator.rs parses the raw FMCW binary data. It separates the stream into frames, splitting frame header information from frame data. It delegates the interpretation of specific frames to tlv_frame_reader.rs and sends the received frames through a channel, one frame at a time.
  • tlv_frame_reader.rs holds multiple functions which decode a single tlv frame.
  • pre_processor.rs: despite its name, the code here does not only pre-process the data; it also runs the interpretation processes. The code in this file often calls out to various files in the pre_processor/ directory. Also note that the file still contains functions from previous experiments. The entry function is new_preprocessing_daemon; the functions which are sure to be important are the ones marked meb, while the torso_remover and arm_localizer functions belong to previously abandoned approaches and can be ignored. The file is a bit of a mess, but simply following the execution path starting at new_preprocessing_daemon should be relatively doable and clean.
  • frame_file_interface handles all writing and reading of recordings. For writing, it timestamps the frames while writing them. For reading, there are two similar systems: one for reading back only mmWave radar data, the other for reading back both mmWave data and Kinect ground-truth data. Note: the Kinect library uses a different timestamping system than IAmMuse (we use Unix epoch time, they use .NET ticks); luckily the translation is a simple offset and unit conversion (as can be seen in the function dotnet_ticks_to_unix_millis in types.rs line 181).
  • ipc.rs handles communication with the visualizer, which is a running Python process. This is done either through Unix domain sockets on Linux or through TCP sockets in the case of Windows.
  • audio_player.rs is a small piece of code to run audio clips to inform the user of specific required actions during system usage. Often called from the various calibration systems.
  • pc_math.rs is a file with general helper functions for point clouds, used throughout several systems.
  • midi_interface.rs is a daemon listening for specific music-note requests on a channel; it translates those into more general MIDI packets which it sends out to other systems to generate actual tones.
  • renderer.rs and fmcw_decoder.rs are deprecated files and can be ignored.

The files in ./pre_processor/ are used in the various stages of the preprocessing:

  • frame_filters.rs is the implementation of the spatio-temporal frame filter.
  • initializables.rs is the general system which manages serializing, deserializing, saving, and loading the various initialization structures.
  • circle_fit.rs calculates the enclosing ball as described in the thesis.
  • hand_weigher.rs simply gives a weight to a point, depending on the point position and the minimal enclosing ball (MEB), which is the data enhancement step in the thesis.
  • angle_to_note.rs is named poorly, as it also does the zone classification (the interpretation, as it is called in the thesis) in addition to calculating the corresponding note.
  • torso_remover.rs, adaptive_kalman.rs, arm_angle_calculator.rs, and arm_location_initializer contain code belonging to previously considered interpretation systems.

Misc.

Note that the code will break after running for 28 hours, due to the naming convention for the mmWave point cloud files (a 4-digit number). If you plan on running the code for an extended period of time, you may want to extend the number range used for saving these files.
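For what it's worth, 10,000 four-digit names over 28 hours works out to roughly one file every 10 seconds. A hypothetical sketch of such a zero-padded naming scheme (the real file-naming code may differ) and the one-character widening fix:

```rust
// Hypothetical sketch of a zero-padded frame-file naming scheme;
// the project's actual file names may be formatted differently.
fn frame_file_name(index: u32) -> String {
    // {:04} pads to 4 digits -> only 10,000 names before the pad overflows.
    format!("frame_{:04}.bin", index)
}

// Widening the pad is a one-character change and extends the range
// from 10^4 to 10^6 unique zero-padded names.
fn frame_file_name_wide(index: u32) -> String {
    format!("frame_{:06}.bin", index)
}
```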

Be aware that the IAmMuse mmWave data uses Unix epoch timestamps, while the Kinect uses .NET ticks; the translation is easy, and can be seen on line 181 of types.rs in the function dotnet_ticks_to_unix_millis.
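For reference, a sketch of such a conversion based on the standard .NET tick definition (one tick is 100 ns, counted from 0001-01-01T00:00:00); this is an illustration of the idea, not a verbatim copy of the function in types.rs:

```rust
// Number of .NET ticks (100 ns units since 0001-01-01T00:00:00)
// at the Unix epoch, 1970-01-01T00:00:00 (DateTime.UnixEpoch.Ticks).
const UNIX_EPOCH_TICKS: i64 = 621_355_968_000_000_000;

// Convert a .NET tick timestamp to Unix milliseconds:
// subtract the epoch offset, then scale 100 ns units down to ms.
fn dotnet_ticks_to_unix_millis(ticks: i64) -> i64 {
    (ticks - UNIX_EPOCH_TICKS) / 10_000
}
```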

I am not legally responsible for any usage of my code; use it at your own risk. Not that I think it's bad code, but I want to be sure :-) Also, if you extend this code or use it in your own projects, please do credit this repository :-)
