Simpler support for new model libraries (#482)
Close this [internal issue](huggingface-internal/moon-landing#8791) (ignore the unrelated README changes).
julien-c authored Feb 15, 2024
1 parent ea2d471 commit d57fc81
Showing 7 changed files with 420 additions and 377 deletions.
81 changes: 47 additions & 34 deletions README.md
@@ -10,34 +10,46 @@
</p>

```ts
// Programmatically interact with the Hub

await createRepo({
  repo: { type: "model", name: "my-user/nlp-model" },
  credentials: { accessToken: HF_TOKEN }
});

await uploadFile({
  repo: "my-user/nlp-model",
  credentials: { accessToken: HF_TOKEN },
  // Can work with native File in browsers
  file: {
    path: "pytorch_model.bin",
    content: new Blob(...)
  }
});

// Use hosted inference

await inference.translation({
  model: 't5-base',
  inputs: 'My name is Wolfgang and I live in Berlin'
})

await inference.translation({
  model: "facebook/nllb-200-distilled-600M",
  inputs: "how is the weather like in Gaborone",
  parameters: {
    src_lang: "eng_Latn",
    tgt_lang: "sot_Latn"
  }
})

await inference.textToImage({
  model: 'stabilityai/stable-diffusion-2',
  inputs: 'award winning high resolution photo of a giant tortoise/((ladybird)) hybrid, [trending on artstation]',
  parameters: {
    negative_prompt: 'blurry',
  }
})

// and much more…
```

# Hugging Face JS libraries

This is a collection of JS libraries to interact with the Hugging Face API, with TS types included.

-- [@huggingface/inference](packages/inference/README.md): Use Inference Endpoints (serverless) to make calls to 100,000+ Machine Learning models
+- [@huggingface/inference](packages/inference/README.md): Use Inference Endpoints (serverless or dedicated) to make calls to 100,000+ Machine Learning models
- [@huggingface/hub](packages/hub/README.md): Interact with huggingface.co to create or delete repos and commit / download files
- [@huggingface/agents](packages/agents/README.md): Interact with HF models through a natural language interface
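All three packages are published on npm under the `@huggingface` scope; assuming an npm-based project, installation would look something like:

```shell
# Install any subset of the Hugging Face JS libraries from npm
npm install @huggingface/inference @huggingface/hub @huggingface/agents
```

With pnpm or yarn, substitute `pnpm add` or `yarn add`.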

@@ -130,30 +142,6 @@ await inference.imageToText({
const gpt2 = inference.endpoint('https://xyz.eu-west-1.aws.endpoints.huggingface.cloud/gpt2');
const { generated_text } = await gpt2.textGeneration({inputs: 'The answer to the universe is'});
```
### @huggingface/agents example

```ts
import { HfAgent, LLMFromHub, defaultTools } from "@huggingface/agents";

const HF_TOKEN = "hf_...";

const agent = new HfAgent(HF_TOKEN, LLMFromHub(HF_TOKEN), [...defaultTools]);

// You can generate the code, inspect it, and then run it
const code = await agent.generateCode("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.");
console.log(code);
const messages = await agent.evaluateCode(code);
console.log(messages); // contains the data

// Or you can run the code directly. However, you can't check that the code
// is safe to execute this way; use at your own risk.
const results = await agent.run("Draw a picture of a cat wearing a top hat. Then caption the picture and read it out loud.");
console.log(results);
```

### @huggingface/hub examples

@@ -184,6 +172,31 @@ await deleteFiles({
});
```


There are more features, of course; check each library's README!

2 changes: 1 addition & 1 deletion packages/hub/README.md
@@ -1,6 +1,6 @@
# 🤗 Hugging Face Hub API

-Official utilities to use the Hugging Face hub API, still very experimental.
+Official utilities to use the Hugging Face Hub API.

## Install

2 changes: 1 addition & 1 deletion packages/tasks/README.md
@@ -25,7 +25,7 @@ This package contains the definition files (written in Typescript) for the huggi

- **pipeline types** a.k.a. **task types** (used to determine which widget to display on the model page, and which inference API to run)
- **default widget inputs** (when they aren't provided in the model card)
-- definitions and UI elements for **third party libraries**.
+- definitions and UI elements for **model libraries** (and soon for **dataset libraries**).

Please add to any of those definitions by opening a PR. Thanks 🔥

7 changes: 2 additions & 5 deletions packages/tasks/src/index.ts
@@ -1,5 +1,4 @@
export { LIBRARY_TASK_MAPPING_EXCLUDING_TRANSFORMERS } from "./library-to-tasks";
-export { MODEL_LIBRARIES_UI_ELEMENTS } from "./library-ui-elements";
export { MAPPING_DEFAULT_WIDGET } from "./default-widget-inputs";
export type { TaskData, TaskDemo, TaskDemoEntry, ExampleRepo } from "./tasks";
export * from "./tasks";
@@ -14,8 +13,8 @@ export {
SUBTASK_TYPES,
PIPELINE_TYPES_SET,
} from "./pipelines";
-export { ModelLibrary, ALL_DISPLAY_MODEL_LIBRARY_KEYS } from "./model-libraries";
-export type { ModelLibraryKey } from "./model-libraries";
+export { ALL_DISPLAY_MODEL_LIBRARY_KEYS, ALL_MODEL_LIBRARY_KEYS, MODEL_LIBRARIES_UI_ELEMENTS } from "./model-libraries";
+export type { LibraryUiElement, ModelLibraryKey } from "./model-libraries";
export type { ModelData, TransformersInfo } from "./model-data";
export type {
WidgetExample,
@@ -41,5 +40,3 @@ export { InferenceDisplayability } from "./model-data";

import * as snippets from "./snippets";
export { snippets };

-export type { LibraryUiElement } from "./library-ui-elements";
2 changes: 1 addition & 1 deletion packages/tasks/src/library-to-tasks.ts
Original file line number Diff line number Diff line change
Expand Up @@ -3,7 +3,7 @@ import type { PipelineType } from "./pipelines";

/**
* Mapping from library name (excluding Transformers) to its supported tasks.
- * Inference Endpoints (serverless) should be disabled for all other (library, task) pairs beyond this mapping.
+ * Inference API (serverless) should be disabled for all other (library, task) pairs beyond this mapping.
* As an exception, we assume Transformers supports all inference tasks.
* This mapping is generated automatically by "python-api-export-tasks" action in huggingface/api-inference-community repo upon merge.
* Ref: https://github.com/huggingface/api-inference-community/pull/158
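The shape of the mapping this comment describes can be sketched as follows. This is an illustrative sketch only: the type names mirror the real module, but the entries shown are example data, and the real file derives `ModelLibraryKey` and `PipelineType` from `./model-libraries` and `./pipelines` rather than declaring them inline.

```typescript
// Illustrative stand-ins for the imported types (not the real definitions)
type PipelineType = "feature-extraction" | "sentence-similarity" | "image-classification" | "text-to-image";
type ModelLibraryKey = "sentence-transformers" | "timm" | "diffusers";

// Each library maps to the tasks it supports on serverless inference;
// Partial<> because most libraries appear with only a few tasks.
const LIBRARY_TASK_MAPPING_EXCLUDING_TRANSFORMERS: Partial<Record<ModelLibraryKey, PipelineType[]>> = {
  "sentence-transformers": ["feature-extraction", "sentence-similarity"],
  timm: ["image-classification"],
  diffusers: ["text-to-image"],
};

// A (library, task) pair absent from the mapping should have inference
// disabled (Transformers is the exception and is assumed to support all tasks).
function isTaskSupported(library: ModelLibraryKey, task: PipelineType): boolean {
  return LIBRARY_TASK_MAPPING_EXCLUDING_TRANSFORMERS[library]?.includes(task) ?? false;
}
```

The `Partial<Record<…>>` shape keeps lookups O(1) per library while letting most libraries omit tasks they do not support.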