Qualcomm AI Direct libGenie binding for Node.js
Make libs available
- On Windows (aarch64), env `PATH` should be able to locate:
  - the libraries from `<sdk-root>/lib/aarch64-windows-msvc`: `QnnSystem.dll`, `QnnHtp.dll`, `QnnHtpPrepare.dll`, `QnnHtpV*Stub.dll`
  - and `libQnnHtpV*Skel.so`, `libqnnhtpv73.cat` from `lib/hexagon-v*/unsigned`
- On Linux (aarch64):
  - env `ADSP_LIBRARY_PATH` should be able to locate `libQnnHtpV*Skel.so`
  - env `LD_LIBRARY_PATH` should be able to locate the libraries from `<sdk-root>/lib/aarch64-oe-linux-gcc11.2`: `libQnnSystem.so`, `libQnnHtp.so`, `libQnnHtpPrepare.so`, `libQnnHtpV*Stub.so`

See the sketch after this list for one way to wire this up from Node.
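The launcher below is only a sketch and not part of this package: it prepends the SDK library directories to the environment of a child Node process and then runs the app that actually imports `node-qnn-llm` inside that child. `QNN_SDK_ROOT`, `app.mjs`, and the `hexagon-v73` directory are placeholder assumptions; substitute your own values.

```js
// Launcher sketch -- not part of this package. QNN_SDK_ROOT, app.mjs and the
// hexagon-v73 directory are placeholder assumptions; adjust them to your setup.
// It prepends the SDK library directories to the environment of a child Node
// process, then runs the app that actually imports node-qnn-llm in that child.
import { spawn } from 'node:child_process';
import path from 'node:path';

const sdkRoot = process.env.QNN_SDK_ROOT;
if (!sdkRoot) throw new Error('Set QNN_SDK_ROOT to your Qualcomm AI Direct SDK root.');

const isWindows = process.platform === 'win32';
const libDir = path.join(
  sdkRoot,
  'lib',
  isWindows ? 'aarch64-windows-msvc' : 'aarch64-oe-linux-gcc11.2',
);
const skelDir = path.join(sdkRoot, 'lib', 'hexagon-v73', 'unsigned'); // match your hexagon-v*

const env = { ...process.env };
if (isWindows) {
  // Windows resolves the DLLs (and the Skel/.cat files) through PATH.
  const pathKey = Object.keys(env).find((k) => k.toUpperCase() === 'PATH') ?? 'PATH';
  env[pathKey] = [libDir, skelDir, env[pathKey]].filter(Boolean).join(';');
} else {
  // LD_LIBRARY_PATH is only read at process start-up, hence the child process.
  env.LD_LIBRARY_PATH = [libDir, env.LD_LIBRARY_PATH].filter(Boolean).join(':');
  env.ADSP_LIBRARY_PATH = skelDir;
}

spawn(process.execPath, ['app.mjs'], { stdio: 'inherit', env })
  .on('exit', (code) => process.exit(code ?? 0));
```

Spawning a child process matters mainly on Linux: the dynamic loader only reads `LD_LIBRARY_PATH` when a process starts, so changing it inside an already running Node process has no effect on later library loads.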
Basic usage:

```js
import { Context, SentenceCode } from 'node-qnn-llm';

// Create a context from a Genie config object...
const context = await Context.create(/* Genie config object */);
// ...or load a packed bundle:
// const context = await Context.load({ bundle_path: 'path/to/bundle', unpack_dir: 'path/to/store/unpacked', n_thread?: Number })

// Stream a response; the callback receives generated text and a SentenceCode.
await context.query('Hello, world!', (result, sentenceCode) => {
  console.log(result);
});

// Persist and restore dialog state.
await context.save_session('path/to/session-directory');
await context.restore_session('path/to/session-directory');

// Generation controls.
await context.set_stop_words(['stop_word1', 'stop_word2']);
await context.apply_sampler_config({
  /* Genie sampler config */
});

await context.release();
```
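If a single string is more convenient than streamed callbacks, `query()` can be wrapped in a small helper. This is only a sketch on top of the API above; it assumes each callback invocation delivers an incremental chunk of text and ignores the sentence code.

```js
// Sketch: collect streamed chunks into one string (assumes incremental chunks).
async function generate(context, prompt) {
  let text = '';
  await context.query(prompt, (chunk, _sentenceCode) => {
    text += chunk;
  });
  return text;
}

// With a live (not yet released) context:
const reply = await generate(context, 'Write one sentence about NPUs.');
console.log(reply);
```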
To make models easier to deploy, we introduced a packed file structure. It provides:
- A constant entry config path.
- Automatic file path resolution.
- Config patching on load.
You can quickly pack your model files with `pack.py`.

Usage: `./pack.py path/to/config.json`
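Once packed, the bundle can be loaded with `Context.load` as shown earlier. A minimal sketch; the paths and the `n_thread` value are placeholders:

```js
import { Context } from 'node-qnn-llm';

// Load a bundle produced by pack.py. unpack_dir is where its contents are
// extracted; n_thread is optional.
const context = await Context.load({
  bundle_path: 'path/to/bundle',
  unpack_dir: 'path/to/store/unpacked',
  n_thread: 4,
});

await context.query('Hello from a packed bundle!', (result, sentenceCode) => {
  process.stdout.write(result);
});

await context.release();
```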
License: MIT
Built and maintained by BRICKS.