TECHNICAL INTEGRATIONS: LouminAIre-PS
Below are the identified integrations and the intended direction for LouminAIre-PS. Subject to change over time, of course.
Google Input Tools: https://chrome.google.com/webstore/detail/google-input-tools/mclkkofklkfljcocdinagocijmpgbhab/related
Front-End Development: https://llvm.org/
Primary Source Languages and Frameworks:
- LangChain (a Python framework)
- Python
- Many, many more...
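To make the LangChain-plus-Python direction concrete, here is a minimal pure-Python sketch of the kind of reusable prompt templating that LangChain provides. It deliberately uses only the standard library; the class and method names (`PromptTemplate`, `render`) are illustrative, not LangChain's actual API.

```python
# Illustrative sketch only: a reusable prompt with named placeholders,
# similar in spirit to LangChain's prompt templates. Not LangChain's API.
from string import Template


class PromptTemplate:
    """A reusable prompt with $-style named placeholders."""

    def __init__(self, template: str):
        self._template = Template(template)

    def render(self, **variables: str) -> str:
        # substitute() raises KeyError if a placeholder is missing,
        # which surfaces template/variable mismatches early.
        return self._template.substitute(**variables)


translate = PromptTemplate("Translate the following text to $language:\n$text")
prompt = translate.render(language="French", text="Hello, world")
# prompt == "Translate the following text to French:\nHello, world"
```

The same template can be rendered with different variables across the pipeline, which is the core idea behind chaining prompts in a framework like LangChain.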
Current and Potential LLM Integrations:
- BERT: Introduced by Google in 2018, it features 342 million parameters and was pre-trained on a large corpus of data for query understanding in Google Search.
- Falcon 40B: Developed by the Technology Innovation Institute, it is a transformer-based, causal decoder-only model available in different parameter sizes and is open source.
- Galactica: Meta's LLM trained on academic material; it generated AI "hallucinations" that were deemed unsafe due to their authoritative tone.
- GPT-3.5: An upgraded version of GPT-3 with fewer parameters; it powered ChatGPT and was integrated into Bing search.
- GPT-4: Released in 2023, GPT-4 is the largest model in OpenAI's GPT series, with capabilities some consider potentially close to artificial general intelligence (AGI).
- LaMDA: Google's LLM family pre-trained on a large text corpus, which attracted attention over claims of sentience.
- Llama 2: Meta AI's LLM, available in various parameter sizes, open source, and built on a transformer architecture. (*Likely not eligible due to licensing limitations in its multi-LLM interoperation clauses.)
- Orca: Microsoft's LLM with 13 billion parameters, aiming to imitate reasoning procedures achieved by larger models like GPT-4.
- PaLM: Google's 540-billion-parameter transformer-based model, specializing in reasoning tasks and available in fine-tuned versions.
- Phi-1: Microsoft's LLM with 1.3 billion parameters, trained on high-quality data for Python coding.
- StableLM: An open-source series of LLMs developed by Stability AI, striving to be transparent and supportive.
- Claude 2: Anthropic's LLM; the company was founded by former members of the OpenAI team.
- And many, many more... (As determined beneficial and needed as LouminAIre-PS evolves over time.)
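Since the list above is expected to grow and shrink over time, one hedged way to sketch this in Python is a small registry that keeps interchangeable LLM backends behind a single interface, so models can be swapped without changing calling code. Everything here (`LLMBackend`, `EchoBackend`, `register`) is a hypothetical illustration, not part of any of the listed models' real APIs.

```python
# Hypothetical sketch: interchangeable LLM backends behind one interface.
# A real backend would call a model API (OpenAI, Hugging Face, etc.);
# EchoBackend is a stand-in so the sketch runs locally.
from abc import ABC, abstractmethod


class LLMBackend(ABC):
    """Common interface every registered model backend must satisfy."""

    @abstractmethod
    def generate(self, prompt: str) -> str: ...


class EchoBackend(LLMBackend):
    """Local stand-in backend used only for this illustration."""

    def generate(self, prompt: str) -> str:
        return f"[echo] {prompt}"


# Registry mapping a model name to its backend class.
REGISTRY: dict[str, type[LLMBackend]] = {}


def register(name: str, backend: type[LLMBackend]) -> None:
    REGISTRY[name] = backend


register("echo", EchoBackend)
reply = REGISTRY["echo"]().generate("ping")
# reply == "[echo] ping"
```

With this shape, adding or dropping a model from the list above becomes one `register(...)` call rather than a change to every caller.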