clip_client.client module
- class clip_client.client.Client(server, credential={}, **kwargs)
Bases: object

Create a CLIP client object that connects to the CLIP server. The server address is in the format scheme://netloc:port, where
- scheme: one of grpc, websocket, http, grpcs, websockets, https
- netloc: the server IP address or hostname
- port: the public port of the server
- Parameters
server (str) – the server URI
credential (dict) – the credential for authentication, e.g. {'Authentication': '<token>'}
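The scheme://netloc:port format can be sketched with the standard library's URL parser. The address below is a hypothetical local gRPC server, not one from this documentation; with clip_client installed you would pass the same string to Client(...).

```python
from urllib.parse import urlparse

# Hypothetical server address for illustration; with clip_client installed
# you would write:
#   from clip_client import Client
#   c = Client('grpc://0.0.0.0:51000')
uri = 'grpc://0.0.0.0:51000'
parts = urlparse(uri)

print(parts.scheme)    # grpc   -> one of grpc/websocket/http/grpcs/websockets/https
print(parts.hostname)  # 0.0.0.0 -> netloc: server IP address or hostname
print(parts.port)      # 51000  -> the public port of the server
```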
- profile(content='')
Profile a single query's roundtrip, including network and computation latency. Results are summarized in a table.
- Parameters
content (Optional[str]) – the content to be sent for profiling. By default it sends an empty Document, which helps you understand the network latency.
- Return type
Dict[str, float]
- Returns
the latency report in a dict.
- encode(content: Iterable[str], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → np.ndarray
- encode(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → DocumentArray
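The first overload's contract can be sketched with a stand-in: N plain strings in, one embedding row per input out. fake_encode below is hypothetical and returns random vectors; the real embedding dimension (512 assumed here) depends on the model the server runs.

```python
import numpy as np

# Hypothetical stand-in for Client.encode, illustrating only the shape
# contract of the Iterable[str] -> np.ndarray overload. With a real server
# this would be:  embeddings = c.encode(['a photo of a dog', ...])
def fake_encode(texts, dim=512):
    rng = np.random.default_rng(seed=0)
    return rng.standard_normal((len(texts), dim)).astype(np.float32)

embeddings = fake_encode(['a photo of a dog', 'a photo of a cat'])
print(embeddings.shape)  # (2, 512)
```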
- async aencode(content: Iterator[str], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → np.ndarray
- async aencode(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → DocumentArray
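The async variant awaits the same request without blocking the event loop. fake_aencode below is a hypothetical stand-in (random vectors, a no-op sleep for the network round trip); with a real server you would `await c.aencode([...])` inside your coroutine.

```python
import asyncio
import numpy as np

# Hypothetical stand-in for Client.aencode: same shape contract as encode,
# but awaitable so other tasks can run during the network round trip.
async def fake_aencode(texts, dim=512):
    await asyncio.sleep(0)  # stand-in for the server round trip
    rng = np.random.default_rng(seed=0)
    return rng.standard_normal((len(texts), dim)).astype(np.float32)

async def main():
    emb = await fake_aencode(['hello', 'world'])
    return emb.shape

print(asyncio.run(main()))  # (2, 512)
```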
- rank(docs, **kwargs)
Rank image-text matches according to the server-side CLIP model. Given a Document with nested matches, where the root is an image/text and the matches are in the other modality (i.e. text/image), this method ranks the matches according to the CLIP model. Each match gains a new score under clip_score, and matches are sorted in descending order of this score. More details can be found at: https://github.com/openai/CLIP#usage
- Parameters
docs (Union[Document, Iterable[Document]]) – the input Documents
- Return type
DocumentArray
- Returns
the ranked Documents in a DocumentArray.
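The ranking contract can be sketched in plain Python: after rank(), each match carries a clip_score and matches are sorted in descending score order. The scores below are made up for illustration; a real server computes them with the CLIP model.

```python
# Plain-dict sketch of the post-rank() invariant; in clip_client the
# entries are Documents and clip_score lives in their scores field.
matches = [
    {'text': 'a photo of a cat', 'clip_score': 0.12},
    {'text': 'a photo of a dog', 'clip_score': 0.81},
    {'text': 'a photo of a car', 'clip_score': 0.07},
]
# sort in descending order of clip_score, as rank() does
ranked = sorted(matches, key=lambda m: m['clip_score'], reverse=True)
print([m['text'] for m in ranked])
# ['a photo of a dog', 'a photo of a cat', 'a photo of a car']
```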
- index(content: Iterable[str], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[Dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100)
- index(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → DocumentArray
- async aindex(content: Iterator[str], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[Dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100)
- async aindex(content: Union[DocumentArray, Iterable[Document]], *, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100)
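The batch_size parameter shared by these methods controls how many Documents go into each request. A minimal sketch of that splitting, assuming simple consecutive chunks (chunked is a hypothetical helper, not part of clip_client):

```python
from itertools import islice

# Hypothetical helper illustrating batch_size: split an iterable into
# consecutive chunks of at most batch_size items, one chunk per request.
def chunked(iterable, batch_size):
    it = iter(iterable)
    while batch := list(islice(it, batch_size)):
        yield batch

batches = list(chunked(['d0', 'd1', 'd2', 'd3', 'd4'], batch_size=2))
print(batches)  # [['d0', 'd1'], ['d2', 'd3'], ['d4']]
```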
- search(content: Iterable[str], *, limit: int = 10, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[Dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → DocumentArray
- search(content: Union[DocumentArray, Iterable[Document]], *, limit: int = 10, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100) → DocumentArray
- Return type
DocumentArray
- async asearch(content: Iterator[str], *, limit: int = 10, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[Dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100)
- async asearch(content: Union[DocumentArray, Iterable[Document]], *, limit: int = 10, batch_size: Optional[int] = None, show_progress: bool = False, parameters: Optional[dict] = None, on_done: Optional[CallbackFnType] = None, on_error: Optional[CallbackFnType] = None, on_always: Optional[CallbackFnType] = None, prefetch: int = 100)
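The limit parameter caps how many nearest Documents a search returns. A toy sketch of that idea, assuming cosine similarity between embeddings (the actual metric and index structure are server-side details; the vectors below are made up):

```python
import numpy as np

# Toy 2-D "indexed" embeddings and a query embedding, for illustration only.
index = np.array([[1.0, 0.0], [0.8, 0.6], [0.0, 1.0]], dtype=np.float32)
query = np.array([1.0, 0.0], dtype=np.float32)

# cosine similarity of the query against every indexed vector
sims = index @ query / (np.linalg.norm(index, axis=1) * np.linalg.norm(query))

# keep only the `limit` most similar Documents, best first
limit = 2
top = np.argsort(-sims)[:limit]
print(top.tolist())  # [0, 1]
```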