This page touches on the broad definition of artificial intelligence as forms of autonomous computation. This large region of the AI Euler diagram encompasses AI in all its forms: machine learning, deep learning, convolutional neural networks, supervised and unsupervised training, clustering algorithms such as k-means, spectral graph theory, adversarial networks, and that is just getting started.
The related sub-pages linked above – gcp, aws and azure – are AI notes on cloud-specific technology stacks.
Suggestion: Build a prompt using the "PACE" acronym: state the Problem, the Action, the Context, and an Example.
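An illustrative (hypothetical) PACE prompt: "Problem: I need a summary of this contract. Action: produce five bullet points. Context: the reader is not a lawyer. Example: 'The tenant pays all utilities except water.'"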
The central idea in generative AI is the transformer architecture, and the central model hub is Hugging Face. Let's run through a Hugging Face usage example, subject to some constraints, and see how the Python library transformers comes into play. In particular we want to learn how to operate an instance of a pipeline, which is imported from transformers. A pipeline is queryable (analogous to requests) using default models through an API, so the minimum viable code is only three or four lines (see the DistilBERT example below).
The source video for this eigenconcept is here on YouTube.
Update the Advanced Package Tool (apt) package index and allow it to upgrade installed packages:
sudo apt update
sudo apt upgrade
Ensure that python, pip and conda are installed. Let's assume pip will do the job.
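A quick sanity check (assuming a Debian/Ubuntu shell with Python 3 installed):
python3 --version
pip --version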
First install PyTorch (the torch package), then transformers.
pip install torch
pip install transformers
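To confirm the installs before going further, a minimal check; both packages expose a standard __version__ attribute:
python -c "import torch; print(torch.__version__)"
python -c "import transformers; print(transformers.__version__)"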
To set up a browser-based interface, clone the following repository:
git clone https://github.com/oobabooga/text-generation-webui
cd text-generation-webui/
Set up and activate a Python environment before installing the dependencies:
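A minimal sketch of that step, assuming Python's built-in venv module (a conda environment would serve equally well):
python3 -m venv venv
source venv/bin/activate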
pip install -r requirements.txt
Run the application:
./start_linux.sh # On Linux
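Note: start_linux.sh bootstraps its own environment on first run, and the web UI is typically served at http://localhost:7860 (the Gradio default); check the script's console output for the exact address.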
Example pre-trained model: DistilBERT
from transformers import pipeline

# With no model specified, the sentiment-analysis task falls back to its
# default checkpoint (a DistilBERT model fine-tuned on SST-2)
sentiment_pipeline = pipeline("sentiment-analysis")
result = sentiment_pipeline("I love Hugging Face!")
print(result)  # e.g. [{'label': 'POSITIVE', 'score': 0.9998}]
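The default checkpoint can change between transformers releases, so pinning the model explicitly is safer. A minimal sketch, assuming the widely used SST-2 DistilBERT checkpoint:
from transformers import pipeline

# Pin an explicit checkpoint instead of relying on the task default
sentiment_pipeline = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(sentiment_pipeline("I love Hugging Face!"))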
More notes from the AI assistant:
YouTube: Search “Running HuggingFace models on WSL2” or “Using HuggingFace Transformers on Ubuntu” for a visual guide.
DistilBERT: A smaller, faster, and cheaper version of BERT, well suited to tasks like text classification and sentiment analysis.
BERT: A versatile model for various NLP tasks, including question answering and natural language inference.
RoBERTa: An optimized version of BERT that improves performance on NLP tasks.
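To illustrate the question-answering use mentioned above, a short sketch; with no model argument the pipeline downloads a default SQuAD-fine-tuned checkpoint on first use:
from transformers import pipeline

# Extractive question answering: the answer is a span of the context
qa = pipeline("question-answering")
result = qa(
    question="Where is the central model hub?",
    context="Hugging Face hosts the central model hub for transformer models.",
)
print(result)  # {'score': ..., 'start': ..., 'end': ..., 'answer': ...}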