WIML Server

Web Interactive Machine Learning service prototype, based on the XMM library and parts of Collective Soundworks and WavesJS

Requirements:

  • Node.js and npm
  • MongoDB (mongod)
  • A local clone of mongo-c-driver
  • A local clone of xmm
  • Xcode (to build xmm-server-tool)

Compilation & Installation:

  • Install mongo-c-driver: in your local mongo-c-driver repository, switch to branch r1.3 (git checkout r1.3), then follow the instructions in its README to build and install libmongoc and libbson
  • From this project's root folder, run npm install to install the dependencies listed in package.json
  • Create a symlink to your local xmm repository in the ./ml-tools/dependencies folder (or just drop a copy there)
  • Go into ./ml-tools/xmm-server-tool, open the xmm-server-tool Xcode project, and build it (this automatically installs the binary into ./bin)

Usage:

Run npm run mongo from the project's root folder to start both the server and mongodb
(equivalent to mongod --dbpath ./data/db followed by npm run watch, except that mongod is automatically killed when node is stopped with CTRL-C)

From a smartphone or any other sensor-equipped device, visit:
http://YOUR.IP.ADDRESS.HERE:3000/wiml-client
You can now record phrases (sensor data time series), send them to the database tagged with labels, and load the model trained on these phrases from the server in real time
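The record-tag-send flow above could look something like the following on the client side. This is a hypothetical sketch: the PhraseRecorder class and the payload field names are assumptions, not the prototype's actual API or wire format.

```javascript
// Buffer incoming sensor frames into a "phrase" (a time series), then tag
// the finished phrase with a label so it can be sent to the database.
class PhraseRecorder {
  constructor() {
    this.frames = [];
    this.recording = false;
  }
  start() {
    this.recording = true;
    this.frames = [];
  }
  push(timestamp, values) { // values: e.g. [ax, ay, az] from a motion sensor
    if (this.recording) this.frames.push({ timestamp, values });
  }
  stop(label) { // returns the labeled phrase, ready to serialize and send
    this.recording = false;
    return { label, length: this.frames.length, data: this.frames };
  }
}

// In a browser, the recorder would typically be fed from devicemotion events:
// window.addEventListener('devicemotion', (e) => recorder.push(e.timeStamp,
//   [e.acceleration.x, e.acceleration.y, e.acceleration.z]));
```

The server would then aggregate phrases sharing a label into a training set for the XMM model.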

Notes:

Only tested on OS X 10.11 with Node v0.12 so far
