clip_client.client module#

class clip_client.client.Client(server)[source]#

Bases: object

Create a Clip client object that connects to the Clip server.

Server scheme is in the format of scheme://netloc:port, where
  • scheme: one of grpc, websocket, http, grpcs, websockets, https

  • netloc: the server ip address or hostname

  • port: the public port of the server

Parameters

server (str) – the server URI
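The scheme://netloc:port format above can be checked with the standard library’s urlsplit; the address below is a placeholder for illustration, not a real server.

```python
from urllib.parse import urlsplit

# Placeholder address in the scheme://netloc:port format described above.
uri = "grpc://0.0.0.0:23456"
parts = urlsplit(uri)

print(parts.scheme)    # → grpc       (one of grpc, websocket, http, ...)
print(parts.hostname)  # → 0.0.0.0    (the server ip address or hostname)
print(parts.port)      # → 23456      (the public port of the server)
```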

encode(content: Iterable[str], *, batch_size: Optional[int] = None, show_progress: bool = False) → np.ndarray[source]#
encode(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False) → DocumentArray
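A minimal sketch of how a batch_size argument typically splits an input iterable before sending requests; the batched helper below is illustrative, not the client’s internal implementation.

```python
from itertools import islice
from typing import Iterable, Iterator, List, Optional


def batched(content: Iterable[str], batch_size: Optional[int] = None) -> Iterator[List[str]]:
    """Yield lists of at most batch_size items; None means a single batch."""
    it = iter(content)
    if batch_size is None:
        yield list(it)
        return
    while chunk := list(islice(it, batch_size)):
        yield chunk


sentences = ["a photo of a cat", "a photo of a dog", "a diagram"]
print(list(batched(sentences, batch_size=2)))
# → [['a photo of a cat', 'a photo of a dog'], ['a diagram']]
```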
profile(content='')[source]#

Profile a single query’s roundtrip, including network and computation latency. The result is summarized in a table.

Parameters

content (Optional[str]) – the content to be sent for profiling. By default it sends an empty Document that helps you understand the network latency.

Return type

Dict[str, float]

Returns

the latency report in a dict.
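The returned dict maps stage names to latencies; the keys and values below are hypothetical placeholders used only to show how the components of the report relate to the total roundtrip.

```python
# Hypothetical latency report (keys and values are placeholders, in ms).
report = {"Roundtrip": 16.0, "Client-server network": 12.0, "Server": 4.0}

# The network and computation components account for the full roundtrip.
overhead = report["Roundtrip"] - report["Server"]
print(f"network overhead: {overhead:.1f} ms")  # → network overhead: 12.0 ms
```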

async aencode(content: Iterator[str], *, batch_size: Optional[int] = None, show_progress: bool = False) → np.ndarray[source]#
async aencode(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False) → DocumentArray
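aencode is useful for overlapping several encode requests; the sketch below substitutes a stand-in coroutine for a live client, only to show the asyncio.gather pattern.

```python
import asyncio
from typing import List


async def fake_aencode(content: List[str]) -> int:
    # Stand-in for Client.aencode: pretend to encode and report the count.
    await asyncio.sleep(0)
    return len(content)


async def main() -> List[int]:
    # Issue two encode requests concurrently.
    return await asyncio.gather(
        fake_aencode(["hello", "world"]),
        fake_aencode(["a photo of a cat"]),
    )


print(asyncio.run(main()))  # → [2, 1]
```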
rank(docs, **kwargs)[source]#

Rank image-text matches according to the server CLIP model.

Given a Document with nested matches, where the root is an image or text and the matches are in the other modality (i.e. text or image), this method ranks the matches according to the CLIP model.

Each match now has a new score stored under clip_score, and matches are sorted in descending order by this score. More details can be found at: https://github.com/openai/CLIP#usage

Parameters

docs (Iterable[Document]) – the input Documents

Return type

DocumentArray

Returns

the ranked Documents in a DocumentArray.
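The descending sort described above can be sketched with plain dicts standing in for Documents; the clip_score key mirrors the score name in the docstring, while the match texts and scores are made up.

```python
# Toy matches standing in for Documents; the scores are made up.
matches = [
    {"text": "a photo of a dog", "clip_score": 0.21},
    {"text": "a photo of a cat", "clip_score": 0.74},
    {"text": "a diagram", "clip_score": 0.05},
]

# Sort in descending order by clip_score, as rank() does server-side.
ranked = sorted(matches, key=lambda m: m["clip_score"], reverse=True)
print([m["text"] for m in ranked])
# → ['a photo of a cat', 'a photo of a dog', 'a diagram']
```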

async arank(docs, **kwargs)[source]#

The asynchronous version of rank(). Ranks image-text matches according to the server CLIP model.

Return type

DocumentArray